Cars are getting smarter. Some can show you a video of what is behind you to help you park in a tight spot. Others can automatically apply the brakes if you are about to run into the car in front of you.
Now cars have a new power: they can snitch to an insurance company about your driving. A tracking device can be installed in your car to monitor how, when, and how far you drive. Progressive and other insurers offer discounts on car insurance to drivers based on data from such devices. Do you accelerate sharply, corner too closely, travel at night, or drive great distances? Those traits can be used against you and prevent you from getting a discount. But many of those factors are beyond your control. If your job requires you to work in the evening, why should you be penalized by your insurer?

Most insurers' devices are installed in the car's data port, under the driver's side of the dashboard, which limits their use to cars sold after 1998. But the Canadian insurer Desjardins uses a mobile phone app, Ajusto, that doesn't even need to be installed in the car.

Phone apps raise additional issues. Nothing prevents an insurer from matching data from the driving app with other information. Nearly two-thirds of smartphone owners look up health information on their devices. What if you've done a Google search for the side effects of an allergy medication? The insurer might take that to mean you are using the medication while driving, despite the drug's warnings about drowsiness.

Who else will ultimately get the driving information? Will the police want to know who is driving faster than the speed limit? As a phone app, Ajusto can tap into location information. Will spouses and employers want to know where the driver has been? Already, information from toll passes has been used as evidence in criminal cases and divorce cases. If you get into an accident while using Progressive's Snapshot device, Progressive will turn over its information about your driving style and history to the court.

These programs to reward safe drivers might actually lead to more accidents.
A friend who used the Progressive device heard a series of beeps from his car if he braked too quickly. The only way to avoid the beeps was to stay four car lengths behind the car in front of him, but that meant other cars were constantly swerving in front of him. It also greatly increased the chance of his being rear-ended.

The tracking devices for cars are touted as a way to save you money. But the data they collect can be used against you. Progressive announced that it will start charging higher rates to drivers who volunteer to use its Snapshot device but whose driving does not measure up. Courts can order that you turn over your driving information to someone who sues you. Tracking devices have real risks. What you might save in premiums, you'll lose in privacy.
Beginning in mid-July, Chicagoans may notice decorative metal boxes appearing on downtown light poles. They may not know that the boxes will contain sophisticated data sensors that will continuously collect a stream of data on “air quality, light intensity, sound volume, heat, precipitation, and wind.” The sensors will also collect data on nearby foot traffic by counting signals from passing cell phones. According to the Chicago Tribune, project leader Charlie Catlett says the project will “give scientists the tools to make Chicago a safer, more efficient and cleaner place to live.” Catlett’s group is seeking funding to install hundreds of the sensors throughout the city. But the sensors raise issues concerning potential invasions of privacy, as well as the creation of data sets with hidden biases that may then be used to guide policy to the disadvantage of poor and elderly people and members of minority groups.
Privacy

Project leaders and City officials deny that the sensors raise privacy concerns. According to Catlett, a computer scientist, the sensors will "count contact with the signal rather than record the digital address of each device," and "information collected by the sensors will not be connected to a specific device or IP address." Brenna Berman, the city's commissioner of information and technology, said that "privacy concerns are unfounded because no identifying data will be collected."

However, Alderman Robert Fioretti has called for a public hearing on the data sensors. Fioretti notes that the City Council was never consulted about the plan, an Emanuel administration initiative, and states that the sensors raise "obvious invasion-of-privacy concerns." Raising a note of skepticism about the City's privacy assurances, Professor Fred Cate of Indiana University's Maurer School of Law noted the difficulty of avoiding the collection of personally identifiable information, even when protections intended to prevent such collection are in place: "Almost any data that starts with an individual is going to be identifiable." Cate's statement accords with scientific research showing that, in practice, supposedly anonymous or anonymized data can in many cases be re-identified with an individual. Cate also raised the question of oversight: "If you spend a million dollars wiring these boxes, and a company comes in and says 'We'll pay you a million dollars to collect personally identifiable information,' what's the oversight over those companies?"

In light of the potential privacy concerns, Dean Harold Krent of IIT Chicago-Kent College of Law noted that transparency is key in Chicago's operation of the sensors. The City must be clear about how many sensors there are and how they are used, and must ensure that the data captured by the sensors is easily accessible to public officials.
Hidden Bias

Jeremy Gillula, a staff technologist at the Electronic Frontier Foundation (EFF), pointed out that the proposed system may create unintentionally biased data sets. The proposed sensors will track contacts with signals from Wi-Fi and Bluetooth-enabled devices, but this will reflect only a subset of the overall foot traffic, since not all passers-by will be carrying devices with Wi-Fi or Bluetooth capabilities.

In Boston, the use of a mobile app called Street Bump to track potholes in the city produced biased data because smartphone owners tended to live in wealthier areas. Similarly, many tweets during Hurricane Sandy originated in the largely affluent borough of Manhattan, giving the impression that it was among the hardest-hit areas of the storm, while in fact lower-income, outlying areas such as Breezy Point, Coney Island, and Rockaway were harder hit. These examples reflect the fact that large datasets, while seemingly objective and abstract, are "intricately linked to physical place and human culture." As the EFF has noted, "many groups are under-represented in today's digital world (especially the elderly, minorities, and the poor). These groups run the risk of being disadvantaged if community resources are allocated based on big data, since there may not be any data about them in the first place." Chicago will need to carefully validate the data collected from the proposed sensors to avoid introducing similar biases into policy and planning decisions.

Michael Holloway is a Legal Fellow at IIT Chicago-Kent's Institute for Science, Law and Technology. John McElligott is a Research Assistant at the Institute and a second-year law student at IIT Chicago-Kent College of Law.

Susan, a professional woman in her 30s, met a man she thought she'd ultimately marry. Their relationship was sufficiently intimate that she sent him a naked photo of herself.
When she caught him cheating, she broke up with him. He took revenge by posting that selfie on a revenge porn website, along with her name, the name of her town, and her social media contact information. She received messages from complete strangers asking for more naked photos. As she went about her daily life, she was afraid that one of those men would stalk her. She worried that her co-workers might have come across the photo. She knew that if she applied for a new job, that nude photo would come up in a Google search of her name. She'd been branded with a modern Scarlet Letter.

Across the Web, thousands of people attack their exes by posting disgusting comments about them, warnings not to date them, or nude photos of them. On October 1, California Governor Jerry Brown signed into law a bill criminalizing what has become known as revenge porn. The law assesses a thousand-dollar fine in a narrow situation. It is a misdemeanor for a person to photograph "the intimate body part or parts of another identifiable person, under circumstances where the parties agree or understand that the image shall remain private, and the person subsequently distributes the image taken, with the intent to cause serious emotional distress, and the depicted person suffers serious emotional distress."

But the law has serious limits. It wouldn't help Susan because it doesn't cover selfies; it would apply only if her boyfriend had taken the photo and then later posted it. Even when an ex-boyfriend did take a photo and post it, it would be hard for the woman to prove that their understanding was that it would remain private. Didn't she know there was at least a chance he was going to show it to his friends? And the requirement that he must have "the intent to cause serious emotional distress" is both hard to prove and too narrow. A man might evade punishment by claiming that by posting the photo he was just trying to brag that his girlfriend was hot.
Or what if they were law students competing for the same job and he said he posted it to reduce her chances of winning the job? That wouldn't be covered by the law. And while the men who posted nude photos of their exes could be prosecuted under the law, it provides no remedy for the women who want their photos removed from the web. Nude photos posted on one revenge porn site are often re-posted on dozens of other sites. A particularly ugly or revealing photo might be replicated in hundreds of places on the Web.

A state law, such as California's, can't reach the main offenders: the websites that host revenge porn. A federal law adopted in the infancy of the Web, Section 230 of the Communications Decency Act, says that interactive computer services are immune from the types of suits for defamation and invasion of privacy that can be brought against traditional publishers. That makes sense for providers such as Comcast and websites such as Facebook (why should they be sued if I defame you in an email or post?), but it doesn't make sense to grant immunity to websites whose sole purpose is to defame or invade privacy. It's time to strip those websites of the ability to digitally gang rape the women whose photos they post.

On revenge porn websites, the posting is just the beginning. Hunter Moore used to run a website, Is Anyone Up?, where other men would write savage comments about the ugliness or sluttiness of the women in the photos. ("No sex with her unless she had a bag over her head" is one of the milder comments.) The more hits Moore's site got, the more money he made through ads. "Hate can be monetized," wrote Kelly Bourdet of Vice. Hunter Moore told the Village Voice how much he'd benefit if someone killed herself because of his posting her nude photo and comments about her: "So if someone fucking killed themselves? Do you know how much hate I'd get?
All the Googling, all the redirects, all, like, the press…"

As I advocate in my book I Know Who You Are and I Saw What You Did: Social Networks and the Death of Privacy, we need to revamp Section 230 to allow people to sue the revenge porn websites for defamation and invasion of privacy, and to grant people the right to have their photos removed. The rationale for protecting internet service providers (that they shouldn't have a duty to police transmissions to see whether people are defaming each other) should not apply to websites whose whole business model is to defame and harass. A woman like Susan should have the right to have her nude photo, intended for an audience of one, removed from a website that exposes her to the world.

Did you know that key features of your smartphone (its camera, its microphone, and its ability to connect to the Internet) can be surreptitiously used against you? Read my blog about it on Time.com.
Lori Andrews is a law professor and the author of I Know Who You Are and I Saw What You Did: Social Networks and the Death of Privacy.