There's a myth circulating in the race to recruit and train cybersecurity professionals: that even those without a technical background can become cyber warriors.
Amid a severe shortage of skilled cybersecurity talent, experts across the cyber industry have fueled the belief that anyone, particularly transitioning military personnel, can enter the in-demand field and succeed with or without a technical background, Alan Paller, founder of the SANS Institute, told Wired Workplace.
Still, much of the challenge stems from a lack of a defined career path for cybersecurity talent, Tipton and Paller said. Perhaps most promising would be a three-step career path where new cybersecurity workers learn the foundations of technology in areas like systems administration or tech services, followed by continuous training and skills development, eventually qualifying them to move into more advanced jobs, Paller said.
In the two weeks between recent revelations that hackers stole data on students, alumni and faculty from the University of Maryland, College Park and the Johns Hopkins University, nearly 360,000 records were swiped in similar attacks at schools in Pennsylvania, Indiana and North Dakota.
Online thieves have increasingly sought sensitive or otherwise valuable data from educational institutions, experts say. Last year alone, breaches included possible exposure of 2.5 million Social Security and bank account numbers associated with an Arizona community college system, 74,000 Social Security numbers of University of Delaware students and staff, and 145,000 applications to Virginia Tech, according to the Privacy Rights Clearinghouse.
At the IAPP Global Privacy Summit, the IAPP and AvePoint announced the release of a new free privacy impact assessment tool that will allow privacy professionals to better organize PIAs, involve other departments in the organization and complete PIAs more rapidly. Available from the front page of the IAPP’s Resource Center and called the AvePoint Privacy Impact Assessment system, or APIA, it is a piece of software organizations can install on their own servers, which is then accessible through a standard web browser. It allows privacy professionals to assign roles, track progress, offer up different questions for types of products and services and has many other advantages over the standard Word- or Excel-based systems currently in place.
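The workflow described above (role assignment, per-product question sets, progress tracking) can be illustrated with a minimal sketch. The class and field names below are hypothetical illustrations, not APIA's actual data model or API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a PIA tracking record; not APIA's actual schema.
@dataclass
class PIARecord:
    project: str
    question_set: str                                # e.g. "web-service", "mobile-app"
    assignees: dict = field(default_factory=dict)    # role -> person/department
    answers: dict = field(default_factory=dict)      # question -> answer

    def assign(self, role: str, person: str) -> None:
        """Involve another department by assigning it a review role."""
        self.assignees[role] = person

    def progress(self, total_questions: int) -> float:
        """Fraction of questions answered so far."""
        return len(self.answers) / total_questions

pia = PIARecord("Student Portal", "web-service")
pia.assign("reviewer", "privacy-office")
pia.answers["data_collected"] = "name, email"
print(pia.progress(total_questions=10))  # 0.1
```

Even this toy version shows the advantage over a shared Word or Excel file: roles, per-assessment question sets, and completion status live in structured fields that software can query and report on.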
Richard Clarke’s short but very interesting keynote focused on his takeaways from the Snowden revelations and NSA spying, and on his top 10 observations from the 46 recommendations he and his team made about US intelligence gathering.
IT systems administrators working in an academic setting are often faced with the unenviable task of balancing two seemingly disparate priorities: managing and mitigating security risks, and ensuring a user experience that is intuitive, seamless and reliable. This dilemma is not a new one — Frederick M. Avolio, writing at Networkcomputing.com, notes that “security and usability are often inversely proportional.”
The unique environment of an academic institution presents its own specific set of challenges. While each organization is different, it is possible to address some general concerns that impact how users interact with their IT resources and the security issues that result. Understanding these issues is the first step towards designing systems that are user friendly without compromising security.
To mark Data Privacy Month, the University of Pennsylvania and the National Constitution Center hosted a Town Hall program with some of the nation's leading experts on privacy and surveillance. On February 3, 2014, Peter Swire of the White House NSA Review Board, Anita Allen of the University of Pennsylvania, and Charlie Savage of the New York Times joined Constitution Center's Jeffrey Rosen to discuss the NSA and government surveillance past and future. University of Pennsylvania faculty, staff, and students, as well as members of the public, were invited to participate in this free event.
If you could not attend the discussion in person, a video recording is now publicly available. Please feel free to share this resource on your campus to continue the privacy dialogue with your colleagues.
The younger generation's desire to be connected all the time expands the attack surface. But experts say enterprises can, and should, manage the risk.
"President Bill Clinton talked about building a bridge to the new millennium. With that bridge now 14 years in the rear-view mirror, the challenge for enterprises is to build a security bridge to the Millennials who are flooding the workplace."
Yet another bill to create a federal requirement for data breach notification has been introduced, this time by Democratic leaders of the Senate Commerce, Science and Transportation Committee.
The Data Security and Breach Notification Act of 2014 would, for the first time, provide a federal standard for companies to safeguard consumers' personal information throughout their systems and to quickly notify consumers if those systems are breached.
The legislation, introduced Jan. 30 by Committee Chairman Jay Rockefeller, D-W.Va., and three co-sponsors, would require the Federal Trade Commission to issue security standards for companies that hold consumers' personal and financial information. In the event of a data breach, companies would be obligated in most instances to notify their affected customers within 30 days so they can take steps to protect themselves from the risk of identity theft and fraud.
A key challenge for any organization is balancing the protection of institutional data, respecting privacy and enabling trust, when employees access institutional systems with personally owned devices. Any BYOD strategy should address this balance. Personally owned devices usually are not under the control of the institution, and verifying that the devices are securely configured can feel intrusive. Allowing personal devices that are not checked for secure configuration and vulnerabilities to log into protected systems creates potentially serious and unknown risks. Institutional attempts to influence or cause configuration changes on personally owned assets and scanning them for vulnerabilities raises questions about trust and liability.
Institutions that provide employees properly configured mobile devices help reduce the need of employees to access institutional systems with personally owned devices, but this approach does not work in all situations. While the potential cost of a security breach can easily exceed the cost of providing mobile devices to employees, the cost of providing the mobile devices also can exceed available funding. Institutionally issued mobile devices may not address all legitimate needs.
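One lightweight middle ground in BYOD discussions is checking a self-reported device posture before granting access to protected systems. The attribute names and minimum requirements below are illustrative assumptions, not an institutional standard:

```python
# Illustrative BYOD posture policy; attribute names and minimums are
# assumptions for the sketch, not a recommended baseline.
MINIMUM_POSTURE = {
    "os_patched": True,      # OS security updates applied
    "disk_encrypted": True,  # device storage encrypted
    "screen_lock": True,     # passcode/screen lock enabled
}

def failed_checks(device: dict) -> list:
    """Return the policy checks this device fails (empty list = compliant)."""
    return [check for check, required in MINIMUM_POSTURE.items()
            if device.get(check) != required]

device = {"os_patched": True, "disk_encrypted": False, "screen_lock": True}
print(failed_checks(device))  # ['disk_encrypted']
```

A check like this, applied at login rather than by scanning the device, sidesteps some of the trust and liability questions raised above, at the cost of relying on what the device reports about itself.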
The University of Illinois at Chicago recently found itself living a modern nightmare: Google’s automated cybersecurity regime mistook the university for the culprit in a spam attack on its students and began blocking university email accounts from sending messages to Gmail users.
The blocking went on for more than two weeks, and the affected Gmail users included 13,000 of the university’s own students.
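Incidents like this usually hinge on sender-authentication signals such as SPF, where a receiver checks whether the sending server is authorized by the domain's published policy. A minimal sketch of that check follows; the record string and networks are made up, and a real evaluator also resolves `include:`, `a:`, and `mx:` mechanisms via DNS, which this sketch skips:

```python
import ipaddress

def spf_allows(spf_record: str, sender_ip: str) -> bool:
    """Check a sending IP against the literal ip4:/ip6: mechanisms of an
    SPF TXT record. Real SPF evaluation also follows include:, a:, and
    mx: mechanisms through DNS lookups; this sketch handles only
    literal networks."""
    ip = ipaddress.ip_address(sender_ip)
    for mechanism in spf_record.split():
        if mechanism.startswith(("ip4:", "ip6:")):
            network = ipaddress.ip_network(mechanism.split(":", 1)[1])
            if ip in network:
                return True
    return False

# Made-up example record for a campus mail relay:
record = "v=spf1 ip4:192.0.2.0/24 include:_spf.example.edu ~all"
print(spf_allows(record, "192.0.2.17"))   # True
print(spf_allows(record, "203.0.113.9"))  # False
```

When a large provider's automated systems see spam arriving from addresses a domain has not authorized, or see authorized relays abused, blocking decisions like the one UIC experienced can follow with little human review.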
On Wednesday, representatives from the University of Maryland and Target—organizations that have both suffered large data breaches in recent months—along with the Federal Trade Commission (FTC), Visa and others, testified before the Senate Commerce, Science & Transportation Committee on protecting consumer data and fighting cyberattacks.
“Security in a university is very different than the private sector because we are an open institution. There are many points of access because it is all about the free exchange of information.” President Wallace D. Loh added, “In the private sector you can centralize cybersecurity. You cannot do that at a university. So we had to find that proper balance between security and access. And that is the challenge for all universities, because, as you know, in the past 12 months, 50 universities have had major data breaches, and not all of them bothered to report it.”
Here is my prescription for creating a cyber security VoTech, extracted from a proposal I helped put together for the State of Michigan.
“In addition to working with the various certificate organizations we will work with security vendors to teach and award certifications in major security tools. This is the fastest road to creating a work force that will have immediate marketability.”
The top cybersecurity professionals I have worked with take one of two paths: hacking or security operations. The hacking route involves researching malware, reverse engineering, and coding. Typically these security researchers are self-taught, but there are opportunities to start at the bottom at the new breed of security intelligence vendors now cropping up.
The other path, operational security, begins with learning tools and applying them on the job. This is the path that will generate the quickest results. Just becoming proficient in configuring and maintaining a single tool can lead to marketability in cybersecurity. Hiring managers I talk to are interested in finding people who can be productive on day one. If a candidate has experience with one of the tools in use at a company, they can jump right in.
The costs of a cyberattack on the University of Maryland that was made public last month will run into the millions of dollars, according to data-security professionals who work in higher education. Such a financial and reputational wallop threatens many colleges that are vulnerable to serious data breaches, experts say.
Crystal Brown, chief communications officer at Maryland, says an investigation into the theft of 309,079 student and personnel records, dating to 1998, is being led by the U.S. Secret Service. As part of its response, the university has contracted with outside forensics experts and is notifying all affected individuals. It is also providing five years’ worth of free credit-protection services to all those affected.
TED Talks can entertain and make you smarter about a lot of subjects, including technology. Here’s a look at 10 excellent tech-related talks, with TED’s brief summaries of each. From thoughts on the NSA spying controversy to an explanation of how one presenter hacked her own online dating profile...
Data stored in an insecure online location for nearly a year exposed personal information on 146,000 students and recent graduates of Indiana University, officials said on Tuesday. The lapse occurred in the registrar’s office when a data file was placed in the wrong folder; it was discovered by an employee last week.
There is no evidence that the university was the target of a cyberattack, said Bradley C. Wheeler, vice president for information technology and chief information officer for the eight-campus system. No servers or systems were hacked.
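A routine defense against exactly this kind of misplacement is scanning shared or web-accessible folders for files containing SSN-like patterns. The sketch below uses a naive nine-digit heuristic that will produce false positives; production scanners apply stricter validation and look for additional identifiers:

```python
import os
import re

# Naive SSN-shaped pattern (ddd-dd-dddd). Real data-discovery tools use
# stricter validation and more identifiers to cut false positives.
SSN_LIKE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_for_ssn_like(root: str) -> list:
    """Return paths under `root` whose text content matches the pattern."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if SSN_LIKE.search(f.read()):
                        hits.append(path)
            except OSError:
                continue  # unreadable file; skip it
    return hits
```

Run periodically against folders that are (or could become) publicly reachable, a scan like this can catch a misfiled export before it sits exposed for a year.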
“A sophisticated computer-security attack” on the University of Maryland on Tuesday gave hackers access to more than 300,000 records of students, faculty and staff members, and others who have been issued university IDs on two of the system’s campuses since 1998.
According to a letter by Wallace D. Loh, the system’s president, experts are trying “to determine how our sophisticated, multilayered security defenses were bypassed,” and a criminal investigation is under way.
The Massachusetts Institute of Technology is still trying to figure out how to answer criticism of its response to the controversial federal prosecution of Aaron Swartz, the hacker and activist who was arrested on the MIT campus in 2011.
On Thursday university officials charged with reviewing MIT’s existing policies and practices flagged several ways the university could do more to protect digital privacy and encourage open-access publishing, according to an update from MIT’s news office.
Today the Obama Administration is announcing the launch of the Cybersecurity Framework, the result of a year-long, private-sector-led effort to develop a voluntary how-to guide for organizations in the critical infrastructure community to enhance their cybersecurity. The Framework is a key deliverable from the Executive Order on “Improving Critical Infrastructure Cybersecurity” that President Obama announced in the 2013 State of the Union.
Through the development of this Framework, industry and government are strengthening the security and resiliency of critical infrastructure in a model of public-private cooperation. Over the past year, individuals and organizations throughout the country and across the globe have provided their thoughts on the kinds of standards, best practices, and guidelines that would meaningfully improve critical infrastructure cybersecurity. The Department of Commerce's National Institute of Standards and Technology (NIST) consolidated that input into the voluntary Cybersecurity Framework that we are releasing today.
The Framework gathers existing global standards and practices to help organizations understand, communicate, and manage their cyber risks. For organizations that don’t know where to start, the Framework provides a road map. For organizations with more advanced cybersecurity, the Framework offers a way to better communicate with their CEOs and with suppliers about management of cyber risks. Organizations outside the United States may also wish to use the Framework to support their own cybersecurity efforts.
Each of the Framework components (the Framework Core, Profiles, and Tiers) reinforces the connection between business drivers and cybersecurity activities. The Framework also offers guidance regarding privacy and civil liberties considerations that may result from cybersecurity activities.
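The Framework Core's five Functions (Identify, Protect, Detect, Respond, Recover) and the Profile idea of comparing a current state against a target state can be sketched as a simple gap analysis. The Functions are real CSF terms, but the tier numbers below are illustrative placeholders, not an actual assessment:

```python
# The five CSF Core Functions are real; the tier values below
# (1=Partial .. 4=Adaptive) are illustrative, not an assessment.
FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

current_profile = {"Identify": 2, "Protect": 2, "Detect": 1,
                   "Respond": 1, "Recover": 1}
target_profile  = {"Identify": 3, "Protect": 3, "Detect": 3,
                   "Respond": 2, "Recover": 2}

def profile_gaps(current: dict, target: dict) -> dict:
    """Tier shortfall per Function; a positive value means work to do."""
    return {f: target[f] - current[f] for f in FUNCTIONS}

print(profile_gaps(current_profile, target_profile))
# {'Identify': 1, 'Protect': 1, 'Detect': 2, 'Respond': 1, 'Recover': 1}
```

This is the "communicate with CEOs and suppliers" use in miniature: a Profile gap table translates cybersecurity activities into a prioritized list that business stakeholders can act on.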
Higher Ed InfoSec Council's insight:
Also of note: "Though the adoption of the Framework is voluntary, the Department of Homeland Security (DHS) has established the Critical Infrastructure Cyber Community (C3) Voluntary Program as a public-private partnership to increase awareness and use of the Cybersecurity Framework. The C3 Voluntary Program will connect companies, as well as federal, state, local, tribal, and territorial partners, to DHS and other federal government programs and resources that will assist their efforts in managing their cyber risks. Participants will be able to share lessons learned, get assistance, and learn about free tools and resources that can help them."
"Within the privacy community it is commonly said that privacy is tightly coupled to societal notions of respect. We advocate for our local, national, and international institutions to protect personal information, to collect only the minimum needed, and to do so not merely to prevent financial loss or compliance with regulations, but because it demonstrates respect for individuals.
But what is the basis for this respect? We show respect for one another's feelings, we respect an individual's rights, and when we confront people in moments of great suffering or joy, we show respect for their privacy — we allow individuals the right to decide whether or not to share with us.
This is the point I want to focus on: By respecting individual privacy, we protect each person's right to choose whom they wish to speak with, to assemble with, and to worship with. Basic human rights codified in the First Amendment to the Constitution of the United States of America. By looking at privacy through this lens, we change the color of the conversation, raising the bar quite a bit higher than compliance with the Red Flags Rule or protection from identity theft."
The year's barely started, and we've already had enough data breaches at major retailers to make a barter economy seem like a good idea. Unfortunately there are yet more security threats to look forward to in 2014. Here are the biggest ones we anticipate.