Phil Zimmermann, the creator of the widely used Pretty Good Privacy e-mail encryption software, recently unveiled Silent Circle, which adds security features to phone, video and text messages sent by smartphones.
Next year, the University of Virginia will begin classes in its first degree in data science, a graduate program that could help train people to analyze data for corporations, scientific researchers and the federal government.
It’s a growing field, especially in health care, where collection of patient data can help researchers study entire populations for risks and trends.
But many data scientists could wind up combing through consumer data to help, say, social media websites such as Facebook do targeted advertising. That could send those graduates into a field lacking professional codes of ethics specific to the work they do.
It is pretty obvious that the privacy profession is changing fast. Once the realm of an elite of nerdy specialists, the profession is opening up to include a whole range of professionals with a variety of talents, training and skill sets. And whilst the complexity of the challenges faced by those with responsibility for managing information, protecting data and safeguarding individual privacy remains as high as in the early days, the implications of addressing those challenges correctly are becoming exponentially greater. If we succeed, we will not only have contributed to the prosperity of future generations, but we will have also done our bit to preserve everyone's freedom.
The Department of Education recently ramped up the pressure on school districts, schools, and higher education institutions to reform their procedures for student data outsourcing, releasing a fourteen-page guidance document on 25 February that reinforces the obligation to comply with privacy laws when using a service provider to host or process student data. By issuing the guidance, the department has put entities covered by student privacy laws on notice of its expectations regarding their responsibilities when entering into these arrangements. Service providers who store and process student data on behalf of school districts and schools should therefore carefully consider the guidance and how it may affect the market for their services and the contractual demands from their education customers.

The guidance is the latest in a series of events that has shone a spotlight on educational use of data-processing vendors. Back in October, a Colorado superintendent made the New York Times when she faced stiff opposition from parents and school board members to the district’s retention of an online records management vendor, an arrangement that would have shifted student records to the vendor’s servers. The next month, after the election of a new school board opposed to the use of the vendor, the superintendent announced her retirement — and on the same night, the board voted to scrap the long-debated vendor relationship.
The Department of Education's Privacy Technical Assistance Center (PTAC) recently released a 14-page guidance document, Protecting Student Privacy While Using Online Educational Services: http://ptac.ed.gov/document/protecting-student-privacy-while-using-online-educational-services
Shortly after receiving the IAPP’s 2014 Leadership Award at this year’s Global Privacy Summit, Federal Trade Commissioner Julie Brill sat down to discuss the agency’s priorities moving forward. Among the most pressing challenges for Brill and the agency are the effects emerging technology is having on the Fair Information Practice Principles paradigm, ensuring that organizations apply robust data security measures and assuaging international concerns about the data collection practices of U.S. government and business.
Speaking in front of a packed room of privacy professionals, Brill applauded their efforts. “You are all on the front lines,” she said, “and we’re in the same endeavor.”
At the IAPP Global Privacy Summit, the IAPP and AvePoint announced the release of a new free privacy impact assessment tool that will allow privacy professionals to better organize PIAs, involve other departments in the organization and complete PIAs more rapidly. Called the AvePoint Privacy Impact Assessment system, or APIA, and available from the front page of the IAPP’s Resource Center, it is software that organizations can install on their own servers and access through a standard web browser. It allows privacy professionals to assign roles, track progress and tailor questions to different types of products and services, and it has many other advantages over the standard Word- or Excel-based systems currently in place.
As the field of privacy has developed, solutions to privacy concerns have multiplied in the marketplace. Tech vendors, service providers, consultants, law firms—all have broadened and deepened the offerings available to privacy professionals for governing data, which is fast becoming a company’s most valuable asset.
Privacy is a dynamic industry that has moved quickly, so quickly that few have stopped to take stock of how far the industry has come and, perhaps more importantly, what the industry has become. The IAPP Industry of Privacy Study seeks to do just that. This comprehensive survey will cover the entire industry of privacy, inventory all of the solutions and services available, and categorize them into meaningful segments that are useful to privacy professionals. The project will allow privacy pros to better understand their organizational maturity and risk profile. Further, it will provide IAPP members with tools to engage a broader audience of influencers who are making spending decisions on privacy-related solutions and services within the enterprise. The industry of privacy deserves to be documented and understood in its entirety. This is still a work in progress, but we would like to share some of our early insights to give you a taste of what’s to come.
Richard Clarke’s short but very interesting keynote focused on his takeaways from the Snowden disclosures and NSA spying, along with his top ten observations from the 46 recommendations he and his team made about U.S. intelligence gathering.
My daughter needed a little bit of prodding to pick which colleges she wanted to tour over spring break. When she showed me a list of universities ranked by a tool offered through her school, her boyfriend warned, "Be careful with those ranking websites." I waited for him to tell us that rankings don’t measure individual fit or other things that really matter. But then he said something unexpected: "Because if a college knows you really want to come, they’ll give you less financial aid." Whoa...What? Where did he hear that?
The Massachusetts Institute of Technology is still trying to figure out how to answer criticism of its response to the controversial federal prosecution of Aaron Swartz, the hacker and activist who was arrested on the MIT campus in 2011.
On Thursday university officials charged with reviewing MIT’s existing policies and practices flagged several ways the university could do more to protect digital privacy and encourage open-access publishing, according to an update from MIT’s news office.
Following a public comment period, the Federal Trade Commission has approved the kidSAFE Seal Program as a safe harbor program under the Children’s Online Privacy Protection Act (COPPA) and the agency’s COPPA Rule.
The Commission’s COPPA Rule requires operators of online sites and services directed at children under the age of 13 to provide notice and obtain permission from a child’s parents before collecting personal information from that child. The COPPA safe harbor provision provides flexibility and promotes efficiency in complying with the Act by encouraging industry members or groups to develop their own COPPA oversight programs.
The COPPA law also directs the Commission to review and approve self-regulatory program guidelines that would serve as safe harbors. Website operators that participate in a COPPA safe harbor program will, in most circumstances, be subject to the review and disciplinary procedures provided in the safe harbor's guidelines in lieu of formal FTC investigation and law enforcement.
Recommendations for creating a foundation for data privacy.
Yet another bill to create a federal requirement for data breach notification has been introduced, this time by Democratic leaders of the Senate Commerce, Science and Transportation Committee.
The Data Security and Breach Notification Act of 2014 would, for the first time, provide a federal standard for companies to safeguard consumers' personal information throughout their systems and to quickly notify consumers if those systems are breached.
The legislation, introduced Jan. 30 by Committee Chairman Jay Rockefeller, D-W.Va., and three co-sponsors, would require the Federal Trade Commission to issue security standards for companies that hold consumers' personal and financial information. In the event of a data breach, companies would be obligated in most instances to notify their affected customers within 30 days of a breach so they can take steps to protect themselves from the risk of identity theft and fraud.
A key challenge for any organization is balancing the protection of institutional data, respecting privacy and enabling trust when employees access institutional systems with personally owned devices. Any BYOD strategy should address this balance. Personally owned devices usually are not under the control of the institution, and verifying that the devices are securely configured can feel intrusive. Allowing personal devices that are not checked for secure configuration and vulnerabilities to log into protected systems creates potentially serious and unknown risks. Institutional attempts to influence or force configuration changes on personally owned devices, or to scan them for vulnerabilities, raise questions about trust and liability.
Institutions that provide employees properly configured mobile devices reduce employees’ need to access institutional systems with personally owned devices, but this approach does not work in all situations. While the potential cost of a security breach can easily exceed the cost of providing mobile devices to employees, the cost of providing those devices can also exceed available funding, and institutionally issued mobile devices may not address all legitimate needs.
Google is in hot water for scanning millions of students' email messages and allegedly building "surreptitious" profiles to target advertising at them.
According to Education Week, a "potentially explosive" lawsuit is wending its way through US federal court, now being heard in the US District Court for the Northern District of California.
In court filings, plaintiffs charge that Google data-mines Gmail users - a group that includes students who use the company's Apps for Education tool suite.
"Stanford research shows even when offering up metadata, it's very revealing."
Since November 2013, researchers at Stanford University have been asking: What’s in your metadata?
Specifically, the study encouraged volunteers who also used Facebook to install an app called MetaPhone on their Android phones. The app was designed to act as a sort of slimmed-down version of the National Security Agency, attempting to gather the same metadata collected by telecom firms and, in turn, intelligence agencies. Volunteers who chose to participate gave the researchers access to their call and text records, including the date, time and duration of each call.
Since late last year, the team has been releasing interim results from the 546 people that chose to participate. On Wednesday, the team released its latest and most complete findings and was startled by what it found.
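The kind of inference that makes bare metadata so revealing can be illustrated with a small sketch (the records and field layout below are hypothetical, not the study's actual data): even without any call content, patterns such as a person's most frequent contact or repeated long late-night calls stand out immediately.

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-metadata records: (number, timestamp, duration in seconds).
# No call content is included -- only the kind of metadata the study examined.
calls = [
    ("555-0101", datetime(2014, 3, 1, 9, 15), 120),
    ("555-0101", datetime(2014, 3, 1, 22, 40), 1800),
    ("555-0199", datetime(2014, 3, 2, 11, 5), 60),
    ("555-0101", datetime(2014, 3, 3, 23, 10), 2400),
]

# Contact frequency alone singles out close relationships.
top_contact, count = Counter(num for num, _, _ in calls).most_common(1)[0]

# Long late-night calls to the same number are even more suggestive.
late_night = [(num, dur) for num, ts, dur in calls
              if ts.hour >= 22 and dur > 600]

print(top_contact, count)   # the most frequently called number and how often
print(len(late_night))      # how many long late-night calls occurred
```

A few lines of aggregation already hint at who matters in someone's life, which is exactly why the researchers found metadata "very revealing."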
How big data could create an inescapable "permanent record"
Arizona State University, like many colleges across the United States, has a problem with students who enter their freshman year ill prepared in math. Though the school offers remedial classes, one-third of students earn less than a C, a key predictor that they will leave before getting a degree. To improve the dismal situation, ASU turned to adaptive-learning software by Knewton, a prominent edtech company. The result: Pass rates zipped up from 64% to 75% between 2009 and 2011, and dropout rates were cut in half.
But imagine the underside to this seeming success story. What if the data collected by the software never disappeared and the fact that one had needed to take remedial classes became part of a student’s permanent record, accessible decades later? Consider if the technical system made predictions that tried to improve the school’s success rate not by pushing students to excel, but by pushing them out, in order to inflate the overall grade average of students who remained.
These sorts of scenarios are entirely possible. Some educational reformers advocate for “digital backpacks” that would have students carry their electronic transcripts with them throughout their schooling. And adaptive-learning algorithms are a spooky art. Khan Academy’s “dean of analytics,” Jace Kohlmeier, raises a conundrum with the “domain learning curves” used to identify what students know. “We could raise the average accuracy for the more experienced end of a learning curve just by frustrating weaker learners early on and causing them to quit,” he explains, “but that hardly seems like the thing to do!”
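Kohlmeier's conundrum is a form of survivorship bias, and a toy simulation makes it concrete (all the numbers below are hypothetical): if weaker learners quit early, the average accuracy of the remaining cohort rises even though nobody actually learned more.

```python
# Toy cohort: each student's per-question accuracy (hypothetical numbers).
students = {"strong": 0.9, "average": 0.7, "weak": 0.4}

def mean_accuracy(cohort):
    """Average accuracy across whoever is still in the cohort."""
    return sum(cohort.values()) / len(cohort)

everyone = mean_accuracy(students)          # 0.666...

# "Frustrate weaker learners early on and cause them to quit":
# only students above a threshold remain at the experienced end of the curve.
survivors = {name: acc for name, acc in students.items() if acc >= 0.6}
after_dropout = mean_accuracy(survivors)    # 0.8

# The curve looks better, but only because the cohort changed.
print(round(everyone, 3), round(after_dropout, 3))
```

The metric improves from roughly 0.67 to 0.8 purely through attrition, which is why optimizing a school's success rate can quietly reward pushing students out rather than helping them excel.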
The risks to student privacy are growing as more information on young people is being collected and stored.
The growing use of technology has enabled the collection of massive amounts of data on students. Students have lost control over their personal information, and the risks to student privacy have risen dramatically. In this post, Khaliah Barnes, director of the Student Privacy Project and administrative law counsel for the non-profit Electronic Privacy Information Center, lays out a Student Privacy Bill of Rights that gives students back control over information about their lives.
In Dragnet Nation, Julia Angwin describes an oppressive blanket of electronic data surveillance. "There's a price you pay for living in the modern world," she says. "You have to share your data."
In this modern world, there is no way to stay off the grid unless you abandon all your technology - and I mean all of it!
Google Inc., fighting claims that it illegally scanned private e-mail messages, argues it shouldn’t have to face a single lawsuit that lumps together hundreds of millions of Internet users.
Related articles from The Berkeley Blog: http://blogs.berkeley.edu/2014/03/01/bmail-and-googles-content-one-box/ and SafeGov.org: http://safegov.org/2014/1/31/google-admits-data-mining-student-emails-in-its-free-education-apps
The last two blogs were about what should be done. This one is about some progressive initiatives. In terms of national policy, the Snowden disclosures have reopened an important conversation about electronic surveillance laws. It falls to all of us to keep that conversation going, at the very least to the conclusion of updating privacy laws such as the Family Educational Rights and Privacy Act; the Computer Fraud and Abuse Act of 1986; the Electronic Communications Privacy Act, also of 1986; the USA PATRIOT Act of 2001, and the Foreign Intelligence Surveillance Act, originally of 1978 and updated in 2008 but evidently in need of further revision to balance civil rights and national security.
To mark Data Privacy Month, the University of Pennsylvania and the National Constitution Center hosted a Town Hall program with some of the nation's leading experts on privacy and surveillance. On February 3, 2014, Peter Swire of the White House NSA Review Board, Anita Allen of the University of Pennsylvania, and Charlie Savage of the New York Times joined Constitution Center's Jeffrey Rosen to discuss the NSA and government surveillance past and future. University of Pennsylvania faculty, staff, and students, as well as members of the public, were invited to participate in this free event.
If you could not attend the discussion in person, a video recording is now publicly available. Please feel free to share this resource on your campus in order to continue the privacy dialogue with your colleagues.
Like many of you, I have been told repeatedly that “privacy is dead.” Most recently, I was walking down the hall in my office building, carrying my Ultrabook with the Future of Privacy Forum’s “I (heart) privacy” sticker on it, and minding my own business. A marketing colleague stopped me and abruptly advised me that “the thing you love is dead.”
Good heavens. For a minute I panicked. What thing? Cuban sandwiches? My cat? Cowboy boots? What? He pointed to my sticker and said, “Privacy is dead!”
Oh, that. No sir, it is not dead.
I am a big fan of zombie movies, and I can tell you that privacy is not dead. At worst, it is the living dead. The undead. Perhaps like Frankenstein’s monster, you thought it was dead, but in fact, it’s aliiiiive!
Recent stories about smart fridges being hacked, cars knowing our intimate secrets and energy companies predicting what we are having for dinner—OK, I made that one up—highlight the fascinating challenges that the Internet of Things (IoT) is set to bring. More fascinating, however, is that addressing these challenges in a way that fully realises the opportunities while properly safeguarding our privacy rests with today's and tomorrow's privacy professionals.
Data Privacy Month 2014 Guest blogger Mike Corn:
"Within the privacy community it is commonly said that privacy is tightly coupled to societal notions of respect. We advocate for our local, national, and international institutions to protect personal information, to collect only the minimum needed, and to do so not merely to prevent financial loss or compliance with regulations, but because it demonstrates respect for individuals.
But what is the basis for this respect? We show respect for one another's feelings, we respect an individual's rights, and when we confront people in moments of great suffering or joy, we show respect for their privacy — we allow individuals the right to decide whether or not to share with us.
This is the point I want to focus on: By respecting individual privacy, we protect each person's right to choose whom they wish to speak with, to assemble with, and to worship with. These are basic human rights codified in the First Amendment to the Constitution of the United States of America. By looking at privacy through this lens, we change the color of the conversation, raising the bar quite a bit higher than compliance with the Red Flags Rule or protection from identity theft."
A team of researchers has developed an Android app to help people better understand when their location is being accessed, something that happens more often than people think.
Android phones display a flashing GPS icon when apps are trying to access the user's location. But few people notice or understand what the icon is telling them, the researchers found.
The app they developed is designed to fix that, by making it clearer to users when other apps are accessing their location data. They tried several methods, including a message that flashes on the device's screen reading, "Your location is being accessed by [app name]."
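The researchers' core idea — surfacing a notice at the exact moment of access rather than relying on a passive icon — can be sketched outside Android with a simple wrapper (the function and app names below are hypothetical illustrations, not the researchers' actual code):

```python
import functools

def notify_on_access(resource_name, notify):
    """Wrap a sensitive getter so every call triggers a visible notice."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(app_name, *args, **kwargs):
            # In the researchers' app this was a message flashed on screen.
            notify(f"Your {resource_name} is being accessed by {app_name}")
            return fn(app_name, *args, **kwargs)
        return wrapper
    return decorator

notices = []  # stand-in for on-screen notifications

@notify_on_access("location", notices.append)
def get_location(app_name):
    # Stand-in for a real GPS fix.
    return (37.4275, -122.1697)

get_location("MapsDemo")
get_location("WeatherDemo")
print(notices)
```

The design point is that the notification fires on every access, naming the requesting app, so the user sees who is reading the data and when, instead of having to interpret an ambiguous status icon.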