Volume 13, Issue 3

US Security Experts Fear 'Cybergeddon'

Agence France Presse (01/07/09)

Shawn Henry, cyber division assistant director at the U.S. Federal Bureau of Investigation (FBI), says that, aside from weapons of mass destruction, cyberattacks pose the greatest threat to the United States. U.S. experts warn of a "cybergeddon" in which an advanced society that has most of its major infrastructure systems linked to or completely controlled by computers is sabotaged by hackers. Henry says terrorist groups are working to create a virtual 9/11 that would inflict the same kind of damage on the U.S. as the 9/11 attacks did. In recent years, Russian hackers have allegedly launched major offensives against Internet networks in Estonia and Georgia, and Palestinian sympathizers have coordinated attacks against hundreds of Israeli Web sites over the past few days. "We're seeing that the folks on the cutting edge of this tend to be the bad guys," says the FBI's Donald Codling. "It's extraordinarily difficult for us to catch them." The FBI's Christopher Painter says cyberattacks are particularly dangerous because the threat is largely invisible and, as a result, not always taken seriously. "It's hard to get your head around the threat," Painter says. "We often discover a company has been attacked and we tell them that and they don't know."

View Full Article | Return to Top

Schools Tap '21st-Century Skills'

Christian Science Monitor (01/08/09) P. 3; Khadaroo, Stacy Teicher

As the economy becomes more knowledge-oriented, so will the need become more pressing for students to possess "21st-century skills" such as problem solving, critical thinking, innovation, and cross-cultural collaboration. Schools will have to cultivate these skills without short-changing students in terms of reading, writing, and math. West Virginia state superintendent Steven Paine observes that 21st-century learning has the potential to integrate academic and real-world training, if education can overcome its tendency to avoid applying knowledge "in real, contextual situations." West Virginia is revising teacher-preparation courses and offering professional development for educators already within the system. Such offerings include a comprehensive Web site and workshops hosted by technology businesses. Many schools want to confer technological literacy, which frequently demands a computer upgrade or improved coaching for teachers. Ann Flynn with the National School Boards Association says lessons employing technology must focus beyond the "wow" factor. Ken Kay with the Partnership for 21st Century Skills argues that the United States needs "a system built around the idea that every kid needs to be able to critically think and problem-solve."

View Full Article | Return to Top

MIT Professor Creates Software to Organize the Details of Everyday Life

Campus Technology (01/05/09) Schaffhauser, Dian

The computer can be a better tool for creating to-do lists and jotting down other information, says Massachusetts Institute of Technology (MIT) professor David Karger. Karger, a member of the MIT Computer Science and Artificial Intelligence Lab, has created List.it, Web-based note-taking software that makes it easier for people to write down short notes and find them later. People ultimately will spend less time entering, storing, and retrieving information, whether email addresses, Web URLs, or shopping lists, using List.it, Karger says. List.it runs in the Firefox browser sidebar, and users can enter information on the fly via a quick input box. A synching feature backs up notes, and installing List.it on multiple computers mirrors notes across all of the machines. "I would never make the claim that we're trying to replace Post-its," says Michael Bernstein, a graduate student in Karger's lab. "We want to understand the classes of things people do with Post-its and see if we can help users do more of what they wanted to do in the first place."
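The article does not describe how List.it's synching actually works; purely as an illustration of how notes might be mirrored across machines, here is a minimal last-write-wins merge in Python (the data layout and timestamps are assumptions, not List.it's format):

```python
import time

def merge_notes(local: dict, remote: dict) -> dict:
    """Merge two note stores, keeping the newest copy of each note.

    Notes are keyed by id; each value is a (last_modified, text) pair.
    """
    merged = dict(local)
    for note_id, (stamp, text) in remote.items():
        if note_id not in merged or stamp > merged[note_id][0]:
            merged[note_id] = (stamp, text)
    return merged

# Example: two machines edited different notes since the last sync.
machine_a = {"n1": (time.time() - 60, "buy milk"), "n2": (time.time(), "call Bob")}
machine_b = {"n1": (time.time(), "buy milk and eggs")}
print(merge_notes(machine_a, machine_b))  # n1 takes machine_b's newer text
```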

View Full Article | Return to Top

What Will Change Everything? Ask a Computer Scientist

ITworldcanada.com (01/06/09) Schick, Shane

John Brockman's Edge.org Web site recently posed the question "What will change everything?" to a group of academics. The answer for computer scientist Roger Schank is a machine that provides knowledge as needed. Schank says information in enterprise databases or on personal computers should find us, rather than having people constantly search for it. Schank views information as stories rather than content, and envisions a future of just-in-time storytelling. "To put this another way, an archive of key strategic ideas about how to achieve goals under certain conditions is just the right resource to be interacting with enabling a good story to pop up when you need it," Schank says. He says goal-directed indexing is about organizing information so that it can be cross-referenced the next time an example of what users need comes up, and in the context of a story that users will understand or remember. Schank says researchers should begin to focus on how to monitor user behavior so that machines can understand their goals and index information appropriately. "We will all become much more likely to profit from humanity's collective wisdom by having a computer at the ready to help us think," he says.
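Schank's proposal is conceptual, but the "goal-directed indexing" he describes can be sketched: file each story under the goal it serves and the conditions under which it applies, then surface it when a matching situation is detected. Every name and example in this Python fragment is invented for illustration:

```python
from collections import defaultdict

story_index = defaultdict(list)  # goal -> [(conditions, story), ...]

def index_story(goal: str, conditions: frozenset, story: str) -> None:
    story_index[goal].append((conditions, story))

def retrieve(goal: str, situation: set) -> list:
    """Return stories filed under this goal whose conditions hold now."""
    return [s for conds, s in story_index[goal] if conds <= situation]

index_story("retain customers", frozenset({"churn rising", "new competitor"}),
            "How Acme's loyalty program cut churn in half")

# The machine detects the user's goal and situation; the story "finds" them.
print(retrieve("retain customers", {"churn rising", "new competitor", "q4"}))
```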

View Full Article | Return to Top

NSF Rethinks Its Digital Library

Science (01/02/09) Vol. 323, No. 5910, P. 54; Mervis, Jeffrey

The National Science Foundation (NSF) has spent roughly $175 million over the last nine years "to provide organized access to high quality resources and tools that support innovations in teaching and learning at all levels" as part of its National Science, Mathematics, Engineering, and Technology Education Digital Library (NSDL) program. The program includes the creation and maintenance of a Web site with an immense diversity of peer-vetted content. It also supports many disciplinary and sector-based portals that offer appropriate material to NSDL, and finances individual projects to help educators and researchers more fully utilize e-learning. When NSDL was launched, it was envisioned as a means to help the United States use cyberspace to facilitate economic growth in the education sector by improving student performance, raising student interest in science, and making high-quality material widely accessible to students, parents, and teachers. NSF saw the need for an administrative infrastructure to make potentially useful Web content easy to find and classroom-customizable. NSF funded "core integration" groups at Columbia University, Cornell University, and the University Corporation for Atmospheric Research to run the main portal. The agency then invited the community to compile collections that would connect with NSDL, and 13 Pathway portals have been funded by NSF to serve both user communities and individual scientific disciplines. To investigate why rank-and-file academic researchers contribute to digital libraries so infrequently, NSF commissioned the University of Wisconsin, Madison, to conduct a survey. The survey found that scientists are inclined to perform their own searches and ascribe equal importance to speed and quality. Study co-author Alan Wolf notes that the scientists surveyed are heavy Google users. Cognitive scientist Tamara Sumner of the University of Colorado, Boulder, is disappointed that NSDL is not better known.

View Full Article | Return to Top

How to Foil 'Phishing' Scams

Scientific American (12/08) Vol. 299, No. 6, P. 104; Cranor, Lorrie Faith

Billions of dollars are lost each year to phishing emails that trick people into exposing personal or corporate information to criminals by posing as legitimate communications from trusted companies and institutions. Carnegie Mellon University's Lorrie Faith Cranor says her research group is focused on the best ways to teach people to identify and avoid such scams. She says that "this research, in turn, is informing our design of anti-phishing software so people are more likely to use it correctly." A study of existing anti-phishing training efforts discovered that they were largely ineffective due to a number of factors, including an overreliance on technical jargon, a lack of actionable advice on protection strategies, and widespread ignorance of corporate phishing attack advisories by employees and customers. Cranor says her team kept these observations in mind throughout the development of the PhishGuru training system, which provides anti-phishing information to users after they have been scammed. The system uses informative cartoons to deliver actionable advice, while a member of Cranor's team devised an online training game designed to educate players about the phishing threat and avoidance strategies. Both projects have helped reduce the likelihood that users will fall victim to phishers. However, Cranor says individual users cannot be expected to combat phishing on their own, so her group is engaged in the development of automatic filters that can spot likely phishing attacks. Research has determined that the effectiveness of these filter warnings depends on their clarity, accuracy, and timeliness, and Cranor says her group is developing programs capable of identifying phishing email through the use of machine-learning methods. One such effort is a tool called PhishPatrol that analyzes emails for possible phishing telltales, and also is trained to recognize phishing indicators using a large collection of authentic and fraudulent messages.
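The article does not detail PhishPatrol's features or model, but the general approach it describes, training a classifier on a corpus of authentic and fraudulent messages, can be sketched. A minimal bag-of-words Naive Bayes example in Python using scikit-learn (the toy corpus is invented):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "Your account has been suspended, verify your password at http://x.example",
    "Urgent: confirm your bank login within 24 hours or lose access",
    "Meeting moved to 3pm, see the attached agenda",
    "Here are the slides from yesterday's seminar, thanks again",
]
labels = [1, 1, 0, 0]  # 1 = phishing, 0 = legitimate

# Bag-of-words features feeding a Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["Please verify your password immediately"]))  # -> [1]
```

A production filter would add many non-textual telltales (mismatched link targets, sender reputation, and so on), but the train-on-labeled-messages loop is the same.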

View Full Article | Return to Top

Billion-Point Computing for Computers

UC Davis News and Information (01/08/09) Greensfelder, Liese

Researchers at the University of California, Davis (UC Davis) and Lawrence Livermore National Laboratory have developed an algorithm that will enable scientists to extract features and patterns from extremely large data sets. The algorithm has already been used to analyze and create images of flame surfaces, search for clusters and voids in a virtual universe experiment, and identify and track pockets of fluid in a simulated mixing of two fluids, which generated more than a billion data points on a three-dimensional grid. "What we've developed is a workable system of handling any data in any dimension," says UC Davis computer scientist Attila Gyulassy, who led the five-year development effort. "We expect this algorithm will become an integral part of a scientist's toolbox to answer questions about data." As scientific simulations have become increasingly complex, the data generated by these experiments has grown exponentially, making analyzing the data more challenging. One mathematical tool to extract and visualize useful features in data sets, called the Morse-Smale complex, has existed for nearly 40 years. The Morse-Smale complex partitions sets by similarity of features and encodes them into mathematical terms, but using it for practical applications is extremely difficult, Gyulassy says. The new algorithm divides data sets into parcels of cells and analyzes each parcel separately using the Morse-Smale complex. The results are then merged together, and as new parcels are created from merged parcels, they are analyzed and merged again. With each step, data that does not need to be stored in memory can be discarded, significantly reducing the computational power needed to run the calculations.
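The real Morse-Smale computation is far more involved, but the parcel-and-merge strategy the article describes has a simple shape: analyze small parcels, merge their summaries pairwise, and discard raw data at each level. A schematic Python sketch, with a trivial min/max summary standing in for the per-parcel topological analysis:

```python
def analyze(parcel):
    """Placeholder per-parcel analysis; the real algorithm computes a
    Morse-Smale complex here."""
    return (min(parcel), max(parcel))

def merge(a, b):
    """Combine two parcel summaries; the parcels' raw data is no longer needed."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def hierarchical_reduce(data, parcel_size=4):
    parcels = [data[i:i + parcel_size] for i in range(0, len(data), parcel_size)]
    summaries = [analyze(p) for p in parcels]  # raw parcels can now be discarded
    while len(summaries) > 1:                  # merge level by level
        summaries = [merge(summaries[i], summaries[i + 1])
                     if i + 1 < len(summaries) else summaries[i]
                     for i in range(0, len(summaries), 2)]
    return summaries[0]

print(hierarchical_reduce(list(range(16))))  # -> (0, 15)
```

Because only summaries survive each level, peak memory stays near the size of one parcel rather than the whole billion-point grid.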

View Full Article | Return to Top

Analyst: Obama may spend a billion on biometrics

By Alice Lipowicz | Jan 07, 2009

The Obama administration is likely to spend $750 million to $1 billion on biometric applications this year, primarily in defense, intelligence and homeland security, according to a new report from Jeremy Grant, an analyst for the Stanford Group Co. research firm.

Key programs at the Defense Department could result in $500 million to $600 million in biometrics contracts, and intelligence programs could add another $250 million to $350 million, Grant said.

Other major programs contributing to the growth include the Homeland Security Department’s U.S. Visitor and Immigrant Status Indicator Technology and Real ID Act of 2005, the FBI’s Next Generation Identification and Homeland Security Presidential Directive-12, he said.

View Full Article | Return to Top

Nine Hot Technologies for '09 (Network World, 01/05/2009)

Our annual list of hot technologies includes a few that exploded on the scene recently plus some that have been simmering for years and just now are coming into their own.

1. 802.11n: The ‘n’ stands for now

2. Unified communications: Getting warmer

3. Data protection: It’s the data, stupid

4. Green IT: A new world view

5. Network access control: After the shakeout

6. 10 Gigabit Ethernet: A switch in time

7. Virtualization: Beyond the server farm

8. Cloud computing: Proceed with caution

9. Web 2.0: Learn to live with it

View Full Article | Return to Top

Verizon service steps up analysis of security risks

Managed security services from Verizon Business get stronger risk-correlation capabilities
By Ellen Messmer, Network World, 01/08/2009

Verizon Business is boosting the vulnerability-scanning and risk-correlation capabilities in its managed security services lineup.

The risk correlation functions of Verizon's managed security services until now were generally limited to defining an incident as a high or low risk, according to Jonathan Nguyen-Duy, Verizon Business director of product management. Now, however, Verizon is boosting its risk-correlation analysis through use of security-event management (SEM) so it can provide a deeper level of detail about the confidentiality, integrity and availability of data in specific customer computer equipment.

"This involves categorization of the data," Nguyen-Duy says, noting that the enhanced Verizon vulnerability-scanning and risk-correlation service will be able to process data from customer intrusion-detection and -prevention systems, firewalls, or other security gear where event data could be picked up by Verizon's SEM engine.

View Full Article | Return to Top

Hack Simplifies Attacks On Cisco Routers

New technique for hijacking routers reinforces need for regular IOS patching
Jan 06, 2009 | 03:19 PM

By Kelly Jackson Higgins
DarkReading

A security researcher has discovered a method of hacking Cisco routers with only basic knowledge about the targeted device.

Although researchers have found various vulnerabilities in Cisco routers, exploits mostly have been focused on hacks of specific IOS router configurations, which require targeted and skilled attacks. But Felix "FX" Lindner, a researcher with Recurity Labs, demonstrated last week at the 25th Chaos Communication Congress in Berlin a technique that lets an attacker execute code remotely on Cisco routers, regardless of their configuration.

"The bottom line is that before, all IOS exploits had to know the exact IOS image running on the target to get code execution. This is approximately a 1 to 100,000 chance, [and] a professional attacker doesn't risk his 0-day exploit when the odds are stacked against him like that," Linder says. "I [demonstrated] that at least on one group of Cisco routers, there is a way to execute code without knowing the IOS image version [they are] running."

Lindner says his exploit method is independent of any specific router vulnerability, though it applies only to stack-based buffer overflow bugs. He was able to execute memory writes and to disable CPU caches on Cisco routers running on the PowerPC CPU. Lindner has not yet tested his technique on larger, more expensive Cisco routers, but plans to do so eventually.

Security researcher Dan Kaminsky says FX's hack disproves conventional wisdom in enterprises that routers are at low risk of attack, and that patching them is riskier than an attack due to the potential network outages that patching can incur.

"Patching is hard. Patching something that, if there's a failure, causes widespread network outages. Having a vaguely credible reason not to patch, like 'the attacker would have to build an attack specifically targeted to my specific version of IOS,' has been something of a mantra for those who haven't wanted to risk updating their systems," says Kaminsky, who is director of penetration testing for IOActive. "The mantra is [now] visibly, irrevocably disproved. FX has shown that however much changes from one version to another, there's a little piece of every Cisco router that's the same, and he can use that piece to attack most of the hardware out there."

View Full Article | Return to Top

Data Breaches Exposed More Than 35 Million Records in 2008

By Brian Prince
2009-01-07

According to findings from the Identity Theft Resource Center, the number of reported data breaches in the United States in 2008 hit 656, nearly 50 percent more than in 2007. The organization puts the number of data records exposed at roughly 35.7 million, but concedes the actual number could be much higher.

The number of reported data breaches in the United States jumped nearly 50 percent in 2008, according to the Identity Theft Resource Center.

All told, there were 656 breaches reported last year, up from 446 in 2007. While 656 may not sound like a lot, those breaches exposed nearly 35.7 million records. More alarming, only 2.4 percent of the breaches had the information secured by encryption or other strong protection methods, and just 8.5 percent had the exposed data protected by passwords.

“Our sense is that two things are happening - the criminal population is stealing more data from companies and that we are hearing more about the breaches,” the ITRC said in a statement. “ITRC has been tracking breaches since 2001. One thing we absolutely can say is that [data breaches are] not a new problem.”

View Full Article | Return to Top

Hackers out bank secrets

BY CANDICE JONES , ITWEB TELECOMS EDITOR

[ Johannesburg, 7 January 2009 ] - The Competition Commission has launched a witch-hunt to prosecute several unknown people whom it believes cracked a confidential report and posted it online.

The report, 22 months in the making, is the culmination of the commission's banking competition enquiry, which started in 2006. “This is frustrating for us,” says the commission's manager of stakeholder relations, Nandi Mokoena.

The 590-page document contains some information that most of the “Big Four” banks would prefer to keep under wraps. This includes customer profiling information and other titbits the banks claim to be trade sensitive. The commission guaranteed the banks that the information presented would not be made public.

Mokoena says the commission's banking enquiry site published a censored version of the report, with the intention of making at least some of the information accessible to the consumer. Instead of removing the sensitive information, the commission used an electronic “blackout” to hide what was not for public consumption.

How it was done

According to the commission, the hackers used a software package to remove the censoring layer and published the document on a local IT news and forum site, along with instructions on how to crack these types of documents.
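The underlying failure is well known: drawing an opaque rectangle over text leaves the text itself in the PDF content stream, where any extractor can recover it. A publisher can check for this kind of leak with a few lines of Python using pdfminer.six (the file name and probe string below are placeholders); rasterizing the pages, as the commission later did by scanning, removes the text layer entirely:

```python
from pdfminer.high_level import extract_text

# If the "blacked-out" text is still in the content stream, extraction finds it.
text = extract_text("banking_enquiry_report.pdf")
if "customer profiling" in text.lower():
    print("Redaction failed: covered text is still extractable.")
```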

“We immediately contacted the site and asked them to remove the information, which they did. We also removed that version from our site, photocopied and scanned the document, so that the blacked-out areas could not be cracked,” she explains.

However, not long after that, the uncensored version appeared on Wikileaks, a US-based online “confidential” content-sharing Web site, which targets government documents.

Wikileaks was founded by Chinese dissidents, journalists, mathematicians and start-up company technologists, from various countries, including SA. It hosts thousands of leaked confidential government documents and has faced several legal battles.

The commission has requested that the site remove the confidential document; however, it is still freely available, and the authority's letter has now also been published. "The problem is that none of these people are actually interested in the content. The only comments that we have seen since this incident has started have been about the cracking of the file, and nothing about the report itself."

View Full Article | Return to Top

Revealed: The Environmental Impact of Google Searches

Times Online (UK) (01/11/09) Leake, Jonathan; Woods, Richard

Harvard University physicist Alex Wissner-Gross has been researching the environmental impact of Google searching and claims that one search generates about 7 grams of CO2 emissions. "Google operates huge data centers around the world that consume a great deal of power," Wissner-Gross says. "A Google search has a definite environmental impact." Analysts estimate that Google processes more than 200 million Internet searches daily. A recent Gartner study found that the global IT industry generates as much greenhouse gas as the airline industry, about 2 percent of global CO2 emissions. A Google search is typically submitted to several servers competing against each other, and requests may even be sent to servers thousands of miles apart, with results returned by whichever server answers fastest; this minimizes delays but increases energy consumption. An estimate from John Buckley, managing director of British environmental consultancy carbonfootprint.com, puts the CO2 emissions of a Google search between 1g and 10g, depending on whether users need to turn on their PCs first. Running a PC generates between 40g and 80g of CO2 emissions per hour, Buckley says. British Computer Society data center expert Liam Newcombe says computers' increased energy use is acceptable as long as Web searches are replacing activities that consume more energy, such as driving to stores, but if Web searches add energy consumption that would not otherwise occur, there may be a problem.
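Taking the article's figures at face value, the daily totals are easy to check with a back-of-the-envelope calculation:

```python
searches_per_day = 200e6      # analysts' estimate of daily Google searches
grams_co2_per_search = 7      # Wissner-Gross's per-search estimate

tonnes_per_day = searches_per_day * grams_co2_per_search / 1e6  # grams -> tonnes
print(f"{tonnes_per_day:,.0f} tonnes of CO2 per day")           # 1,400 tonnes
```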

View Full Article | Return to Top

Supercomputing Helps UC San Diego Researchers Visualize Cultural Patterns

UCSD News (01/09/09) Ramsey, Doug

The Department of Energy (DOE) and the National Endowment for the Humanities (NEH) have awarded 330,000 hours of supercomputer time to the University of California, San Diego (UCSD) Software Studies Initiative for use in its Visualizing Patterns in Databases of Cultural Images and Video project. The time grant is one of three inaugural awards from the Humanities High Performance Computing Program, recently created by the DOE and NEH. "Digitization of media collections, the development of Web 2.0, and the rapid growth of social media have created unique opportunities to study social and cultural processes in new ways," says Software Studies Initiative director and principal investigator Lev Manovich. "For the first time in human history, we have access to unprecedented amounts of data about people's cultural behavior and preferences as well as cultural assets in digital form. This grant guarantees that we'll be able to process that data and extract real meaning from all of that information." The project will process millions of public domain images, paintings, professional photographs, graphic designs, videos, feature films, animations, music videos, and user-generated photos and videos. Manovich has been developing the Software Studies Initiative framework, which uses interactive visualization, data mining, and statistical data analysis to research, teach, and present cultural artifacts, processes, and flows. The project also aims to use cultural information on the Web to create detailed and interactive spatio-temporal maps of contemporary global cultural patterns. Algorithms will be used to extract image features and structure from images and video to create metadata that can be analyzed using statistical techniques.
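The article does not specify which image features the project extracts. As one simple illustration of turning an image into statistically analyzable metadata, the sketch below computes a normalized color histogram, a feature that scales well to batch processing of millions of images (the file name is hypothetical):

```python
import numpy as np
from PIL import Image

def color_histogram(path: str, bins: int = 8) -> np.ndarray:
    """Return a normalized per-channel color histogram as a feature vector."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    hists = [np.histogram(pixels[:, c], bins=bins, range=(0, 255))[0]
             for c in range(3)]  # one histogram per R, G, B channel
    features = np.concatenate(hists).astype(float)
    return features / features.sum()

# features = color_histogram("painting_001.jpg")  # hypothetical input file
```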

View Full Article | Return to Top

Teens Prepared for Math, Science Careers, Yet Lack Mentors

MIT News (01/07/09)

U.S. adolescents are eagerly and positively embracing the subjects of science, technology, engineering, and mathematics (STEM), but there is a pronounced lack of encouragement from mentors and role models, as indicated by the 2009 Lemelson-MIT Invention index. Eighty-five percent of teens polled in the study expressed interest in STEM, and 44 percent said their interest was fueled by "curiosity about the way things work." Most of the interested teens said a sense of altruism rather than materialism would motivate them to pursue careers in related fields. Eighty percent of respondents believe their schools have adequately prepared them for such careers, and the stereotypical view of scientists, engineers, and mathematicians as geeky was held by only 5 percent of teens. "Increased exposure to STEM through hands-on learning and interaction with teachers and professionals in these fields may be partly responsible for this positive shift in teens' perceptions," says Lemelson-MIT Program invention education officer Leigh Estabrooks. However, almost 66 percent of survey respondents said they may be discouraged from pursuing a STEM career because they do not know anyone who is employed in these fields or understand what people in these fields do. The Lemelson-MIT InvenTeam effort establishes teams of high school students, teachers, and mentors that receive grants to solve real-world problems through technological innovation. The program is designed to get high school students excited about invention, enable students to problem solve, and nurture an inventive culture in schools and communities.

View Full Article | Return to Top

NSF Looking for Wicked Cool Visual and Data Analysis Algorithms

Network World (01/07/09)

The National Science Foundation (NSF) wants to develop highly interpretive mathematical and computational algorithms and techniques to help the U.S. government and private researchers evaluate the data generated by health care, computational biology, security, and other areas. NSF wants to make it easier for law enforcement and the intelligence community to present its data in a visual format, which will require the development of new algorithms capable of representing and transforming digital data into mathematical formulations and computational models that allow for efficient and effective visualization. NSF's research effort is part of a five-year, $3 million project known as the Foundation on Data Analysis and Visual Analytics (FODAVA), which is led by the Georgia Institute of Technology, the NSF, and the Department of Homeland Security. One FODAVA program is a Georgia Tech system known as Jigsaw, which provides multiple coordinated views of large document collections to show connections between entities found within the collection. Meanwhile, the Defense Advanced Research Projects Agency says it wants to develop software capable of capturing knowledge from naturally occurring text and transforming it into the formal representations used by artificial-intelligence reasoning systems.
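Jigsaw's entity extraction and coordinated views are far richer than anything shown here, but the core idea, deriving an entity co-occurrence graph from a document collection for a visualization to render, can be illustrated with a toy Python fragment in which "entities" are naively taken to be capitalized words:

```python
from collections import Counter
from itertools import combinations

docs = [
    "Smith wired funds to Acme Holdings in Zurich",
    "Acme Holdings listed Smith as a director",
]

edges = Counter()
for doc in docs:
    entities = sorted({w for w in doc.split() if w[0].isupper()})
    edges.update(combinations(entities, 2))  # every entity pair in this doc

for (a, b), n in edges.most_common(3):      # strongest connections first
    print(f"{a} -- {b}: co-occur in {n} document(s)")
```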

View Full Article | Return to Top

Alice's Cyber-Wonderland Keeps Growing

Pittsburgh Tribune-Review (01/06/09) Heinrichs, Allison M.

Teachers from high schools, community colleges, and universities recently gathered at Carnegie Mellon University (CMU) to learn more about Alice, a free computing tool that can help generate student interest in computer science. Alice teaches students how to program by having them create three-dimensional animations and stories. Created by CMU computer scientist Randy Pausch and fellow researchers, Alice is particularly geared toward attracting young women to computer science by enabling them to use programming as a storytelling method. The newest version of the program, Alice 3, incorporates characters, motions, and art from the computer game "The Sims," which was donated to CMU by Electronic Arts to give Alice a more polished and sophisticated appearance. CMU scientists are asking teachers to test Alice 3 with students, and to track any bugs in the software so they can be fixed before worldwide release. "Today's students are very savvy about video games," says CMU professor Wanda Dann, director of the Alice project. "And when they're creating their own animations, they would really like the animations to look like the video games do." CMU says about 15 percent of colleges use Alice to teach computer programming. Sun Microsystems engineer Daniel Green, who teaches middle and high school students at a computer club, says Alice also is excellent for younger students. "I've been fundamentally surprised at how young the kids are who are using this," Green says.

View Full Article | Return to Top

Top 8 Microsoft Research Projects to Improve Our Lives

Network World (01/07/09) Ashley, Mitchell

Microsoft Research has a Web site dedicated to its socio-digital systems research projects. The Digital Postcard project could eventually turn digital picture frames into devices capable of displaying holiday cards or birthday pictures and greetings from loved ones. The Epigraph project could help family members not living at home communicate by providing each family member with a space on a screen to post whatever content they feel like sharing with other family members. The Digital Shoebox and Family Archive projects aim to help users manage the growing volume of data created by digital photos. The Shake2Talk project is exploring the use of the smart phone's vibrate feature as a means of communication, such as feeling through your phone that a child arriving home from school is putting the key in the front door, or feeling a romantic heartbeat when a significant other sends a text message. The Whereabouts Clock shows where family members are located based on their cell phone's GPS location. Finally, the Text-it-Notes and TEXT2PAPER projects focus on transferring messages between paper and digital forms without manual transcription.

View Full Article | Return to Top

P2P Traffic Control

EurekAlert (01/07/09)

Researchers at the University of California, Irvine believe that peer-to-peer style information sharing could be used to improve road conditions for drivers. Using local-area wireless technology, vehicles could form an ad hoc network to exchange information about traffic conditions, accidents, and other incidents on the roadways. This traveler-centric, zero-infrastructure network would share traffic information based on the same concept that file sharers on the Internet use to exchange music and video files. The researchers recently tested a prototype of the system, called Autonet, which is based on 802.11b wireless technology. Autonet features a graphical in-vehicle computer client that continuously monitors other nearby clients on the wireless network and shares information about local road conditions. The researchers say that Autonet can exchange about 3,500 traffic incident records in the time it takes two vehicles to pass each other at highway speeds. Autonet's wireless clients also need not be vehicles.
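The article does not give Autonet's message format, but the gossip-style exchange it describes can be sketched: when two clients come into range, each merges the other's incident reports, keeping the freshest copy of each and discarding stale ones. The record fields and the 30-minute staleness policy below are assumptions:

```python
import time

MAX_AGE_S = 1800  # assumed policy: discard reports older than 30 minutes

def merge_incidents(mine: dict, theirs: dict, now: float) -> dict:
    """Merge a peer's reports into ours; newest copy wins, stale ones dropped."""
    newer = {k: v for k, v in theirs.items()
             if k not in mine or v["seen"] > mine[k]["seen"]}
    merged = {**mine, **newer}
    return {k: v for k, v in merged.items() if now - v["seen"] < MAX_AGE_S}

now = time.time()
car_a = {"I-405:mile12": {"type": "accident", "seen": now - 300}}
car_b = {"I-405:mile12": {"type": "accident", "seen": now - 60},
         "CA-55:mile3":  {"type": "congestion", "seen": now - 2400}}  # stale
print(merge_incidents(car_a, car_b, now))  # keeps only the fresher I-405 report
```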

View Full Article | Return to Top

CWE/SANS TOP 25 Most Dangerous Programming Errors

Experts Announce Agreement on the 25 Most Dangerous Programming Errors - And How to Fix Them
Agreement Will Change How Organizations Buy Software.
Project Manager: Bob Martin, MITRE
Questions: top25@sans.org

(January 12, 2009) Today in Washington, DC, experts from more than 30 US and international cyber security organizations jointly released the consensus list of the 25 most dangerous programming errors that lead to security bugs and that enable cyber espionage and cyber crime. Shockingly, most of these errors are not well understood by programmers; their avoidance is not widely taught by computer science programs; and their presence is frequently not tested by organizations developing software for sale.

The impact of these errors is far reaching. Just two of them led to more than 1.5 million web site security breaches during 2008 - and those breaches cascaded onto the computers of people who visited those web sites, turning their computers into zombies.
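This excerpt does not enumerate the list itself, but SQL injection is among its best-known entries, and it illustrates how a single coding error becomes a mass Web site compromise. A minimal before-and-after contrast using Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: splicing input into the SQL rewrites the WHERE clause,
# so this returns every row in the table.
rows = conn.execute(
    "SELECT * FROM users WHERE name = '%s'" % user_input).fetchall()
print("string-built query matched:", rows)

# Safe: a parameterized query binds the value as data, never as SQL,
# so no rows match the literal string.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print("parameterized query matched:", rows)
```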

People and organizations that provided substantive input to the project are listed below. They are among the most respected security experts and they come from leading organizations ranging from Symantec and Microsoft, to DHS's National Cyber Security Division and NSA's Information Assurance Division, to OWASP and the Japanese IPA, to the University of California at Davis and Purdue University. MITRE and the SANS Institute managed the Top 25 Errors initiative, but the impetus for this project came from the National Security Agency, and financial support for MITRE's project engineers came from the US Department of Homeland Security's National Cyber Security Division. The Information Assurance Division at NSA and the National Cybersecurity Division at DHS have consistently been the government leaders in working to improve the security of software purchased by the government and by the critical national infrastructure.

View Full Article | Return to Top
