New Google Results for “Bail Bonds Near Me”

Google Bans Ads for Bail Bonds Companies

As the bail reform debate continues, Google has decided to unequivocally pick a side. As of July 2018, a search that previously returned many paid results (Google Ads) now returns only organic search results and local listings, because Google has decided not to allow bail bonds companies to advertise on its search engine.

Google Standing Up for Bail Reform - or Advocating for Technology?

The advertising ban took effect in July 2018, when Google said its aim was to protect its users from “deceptive or harmful products.” Google, a company whose business IS algorithms, would rather have an algorithm determine whether a defendant stays in jail or goes free while awaiting trial. Under the current bail system, a defendant is arrested and a judge sets bail (or bail is automatically determined by the type of crime). If the defendant cannot afford to post bail, they work with a bail bonds company to acquire a bail bond. Once bail is posted, the defendant is free to return to their life, their family, and their job until their court date, when a judge or jury hears the case and determines guilt and, if applicable, the sentence. The bail bondsman also ensures the defendant shows up to court to stand trial for the accused crimes.

Google Prefers an Algorithm to Bail Bondsmen

Google announced a partnership with the Koch brothers to push for changes to the bail system. Mark Holden, of Koch Industries, confirmed that “our desired state is that after people are arrested, there should be a risk assessment done, a determination if they are a risk to public safety,” and then a decision on whether they should have to pay bail and how much it should be.

The risk assessment Google and Koch Industries favor over the bail bonds system is an algorithm meant to assess the risk of releasing an individual and expecting them to return for court. Doesn't it raise eyebrows that Google's business IS algorithms? Wouldn't Google benefit from more decisions being based on algorithms?

The aim of these algorithms is to give judges another tool to help predict who will return to court for their future proceedings and who will not; who will commit a serious new crime while awaiting their court date, and who will abide by their pretrial conditions; who will turn their life around, and who will struggle while released.
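To make the idea concrete, here is a rough sketch in Python of how a simple points-based pretrial risk score might be computed from a defendant's history. The factor names, weights, and cutoffs below are invented for illustration only; they do not reflect the formula of any real risk assessment tool.

```python
# Hypothetical illustration of a points-based pretrial risk score.
# Factor names, weights, and cutoffs are invented for this example and
# do not represent any real risk assessment tool.

def pretrial_risk_score(age, prior_convictions, prior_failures_to_appear,
                        pending_charges):
    """Return (score, category) for a hypothetical pretrial risk assessment."""
    score = 0
    if age < 23:                                    # younger defendants scored as higher risk
        score += 2
    score += min(prior_convictions, 3)              # cap the contribution at 3 points
    score += 2 * min(prior_failures_to_appear, 2)   # missed court dates weigh heavily
    if pending_charges:
        score += 1

    # Map the raw score onto a release recommendation band.
    if score <= 2:
        category = "low risk - recommend release"
    elif score <= 5:
        category = "moderate risk - release with conditions"
    else:
        category = "high risk - recommend detention or bail"
    return score, category


if __name__ == "__main__":
    print(pretrial_risk_score(age=21, prior_convictions=1,
                              prior_failures_to_appear=0, pending_charges=True))
```

Even in a toy example like this, the result depends entirely on which factors are counted and how they are weighted, which is exactly where critics say bias can creep in.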

Putting Bail Bondsmen Out of Business While Algorithms Keep People of Color in Jail

Sure, as a bail bonds company it's impossible to read the bail reform news and not feel threatened. At ABC Bail Bonds we have spent decades providing bail bonds for defendants around the nation, concentrating on the areas where we serve the community out of our local offices in Cleveland, Toledo, and Columbus. And if you know any bail bondsmen or bail bonds companies, you know we are not the 1-percenters, the extremely wealthy benefiting from the less privileged. The bail bonds industry is hard work. Our clientele are criminals and people accused of crimes. They are understandably upset about their situation and see the bail bonds process as an expense they are not happy to pay.

Bail bonds companies make a business of dealing with the population that is getting locked up. Some are wrongly accused and some are criminals. It isn't up to the bail bonds company to discriminate. It's our job to post the bond and get the defendant to court. We work with defendants, their families, significant others, and whoever else is helping to get the bail together so the defendant doesn't have to wait in jail. It isn't easy work. An arrest and the process of going through the court system is frustrating and financially debilitating for our clients. However, many of them would not have the option of getting out of jail if the system relied on an algorithm.

Bail Reform Benefits the Technology Industry to the Detriment of Bail Bonds Companies

In short, bail reform is likely to benefit the software companies creating the risk assessment algorithms while bail bondsmen, bounty hunters, and their support staff are put out of work. What's most disturbing is that the 1-percenters who stand to benefit from the widespread adoption of predictive algorithms are using a platform they don't believe in to push their cause. Google and the Koch brothers don't care about people being locked up because they can't afford bail. If they really cared about the people in jail, Google and Koch Industries could run their own algorithm and post bail for everyone their technology determines to be low-risk.

So why is Google getting involved? Why are the Koch brothers advocating for bail reform? Is it because they want to help all the innocent, impoverished accused awaiting their trials? Why is Google removing all paid advertising for bail bonds companies? We wonder if it's because they want to push through technology, gather more big data, and come up with more technological advances based on predictive algorithms.


Bail Reform Algorithms: Racially Biased, Perpetuate Same Injustice

Many articles have been written regarding the injustices of a technology-driven, algorithm-based system that determines who goes free and who remains detained while awaiting trial.

“One State's Bail Reform Exposes the Promise and Pitfalls of Tech-Driven Justice” discusses bias in the algorithms, lack of transparency in the factors affecting the risk-based score, and over-reliance on the risk assessment algorithm, which can allow dangerous criminals to be set free and commit more crimes.

“The Bail-Reform Tool That Activists Want Abolished” reviews how these algorithms perpetuate racial discrimination and how more than 100 civil-rights groups, including the ACLU, signed a statement of concern urging jurisdictions to stop using the risk assessment algorithms.

“My concern [about using the risk assessment algorithms] is that what you could have is essentially racial profiling 2.0,” said Vincent Southerland, the executive director of the Center on Race, Inequality, and the Law. “We’re forecasting what some individuals may do based on what groups they’re associated with have done in the past.”

In “Weapons of Math Destruction,” author Cathy O’Neil reveals that the models being used today are opaque, unregulated, and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination.

“Civil Rights Groups Call for Reforms on Use of Algorithms to Determine Bail Risk” summarizes how a large coalition of civil rights groups is calling for jurisdictions across the country to drastically change how they use risk assessment algorithms in determining whom to lock up.