Volume 14, Issue 1

New Wireless Standard Promises Ultra-Fast Media Applications

Georgia Institute of Technology (01/22/09) Fernandez, Don

The Georgia Institute of Technology's Georgia Electronic Design Center (GEDC) has developed a complementary metal oxide semiconductor (CMOS) chip capable of transmitting 60 GHz digital radio-frequency signals. GEDC researchers say the technology could lead to the rapid transfer of high-definition movies and other large files from a PC to a cell phone, virtually wireless desktop computers and data centers, wireless home DVD systems, in-store kiosks that can download movies to mobile devices, and the ability to move gigabytes of photos or video files from a camera to a PC almost instantly. "We believe this new standard represents a major step forward," says GEDC director Joy Laskar. "Consumers could see products capable of ultra-fast short-range data transfer within two or three years." GEDC's chip provides multi-gigabit wireless transmissions by combining 60 GHz CMOS digital radio capabilities and multi-gigabit signal processing in an ultra-compact device. Laskar says the new technology represents the highest level of integration for 60 GHz wireless single-chip solutions. "Multi-gigabit technology definitely has major promise for new consumer and IT applications," says Microsoft Research's Darko Kirovski. GEDC researchers say they have already achieved high data transfer speeds that could lead to unprecedented short-range wireless speeds, including 15 Gbps at 1 meter, 10 Gbps at 2 meters, and 5 Gbps at 5 meters.
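
At the rates quoted, a quick back-of-the-envelope calculation shows why near-instant file transfer is plausible. The file size below is an illustrative assumption (roughly one single-layer DVD), not a figure from the article:

```python
# Transfer times at the data rates GEDC reports.
# The file size is an illustrative assumption; the rates are from the article.
rates_gbps = {"1 m": 15, "2 m": 10, "5 m": 5}

def transfer_seconds(file_gigabytes: float, rate_gbps: float) -> float:
    """Time to move a file, converting gigabytes to gigabits (x8)."""
    return file_gigabytes * 8 / rate_gbps

hd_movie_gb = 4.7  # roughly one single-layer DVD
for distance, rate in rates_gbps.items():
    t = transfer_seconds(hd_movie_gb, rate)
    print(f"{hd_movie_gb} GB at {rate} Gbps ({distance}): {t:.1f} s")
```

Even at the 5-meter rate, a full movie moves in well under ten seconds, which is the consumer scenario the researchers describe.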

View Full Article | Return to Top

What Links Open Source and Literature?

ETH Life (01/21/09) Salzmann, Niklaus

ETH Zurich researchers have found that the mathematical distribution of Zipf's law (the frequency of any word within a corpus of natural language is inversely proportional to its rank in the frequency table) holds true in other systems. The researchers tested their theory by examining the linking of Debian Linux software packages. They say the distribution trend can be found in different systems, including the number of visitors to Web sites, the size of towns, and the size of companies in various countries. By studying four versions of Debian Linux, the researchers determined that the number of incoming package links obeys Zipf's law, and that the number of links referring to a package develops over time. The researchers say the ability to estimate the growth of Linux packages could have a significant effect on entrepreneurial endeavors, as could the ability to predict the longevity of a company based on its size.
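
Zipf's law is easy to check empirically: rank the words of a corpus by frequency and compare each count against the prediction freq(rank) ≈ freq(1)/rank. A minimal sketch (the sample text is a stand-in for a real corpus):

```python
from collections import Counter

def zipf_table(text: str):
    """Rank words by frequency and pair each observed count
    with the count Zipf's law predicts: top_count / rank."""
    counts = Counter(text.lower().split())
    ranked = counts.most_common()
    top = ranked[0][1]
    return [(rank, word, count, top / rank)
            for rank, (word, count) in enumerate(ranked, start=1)]

sample = "the cat saw the dog and the dog saw a cat near a tree"
for rank, word, observed, predicted in zipf_table(sample)[:5]:
    print(rank, word, observed, round(predicted, 2))
```

On a toy sample the fit is rough; the law only emerges clearly on large corpora, which is why the researchers needed four full Debian releases of link data.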

View Full Article | Return to Top

Technology Will Ease Healthcare Dilemma

University of Ulster (01/27/09)

University of Ulster researchers predict that high-tech, home-based health monitoring will be a critical part of alleviating some of the pressures the healthcare system will experience as the world's population continues to age. The researchers are developing health monitoring systems that people can use in their own homes. University of Ulster's Nanotechnology and Integrated Bioengineering Centre director Jim McLaughlin says that independent living is the motivating factor. Computer Science Research Institute at Ulster professor Chris Nugent says both patients and medical professionals can benefit from patients being more involved in the management and monitoring of their health. A new international Center for Intelligent Point of Care Sensors was launched by the University of Ulster and Dublin City University to drive research and development in the area of point-of-care sensors. Point-of-care sensors are handheld, wearable, or transportable devices for use in the home, or hospitals, to provide healthcare professionals with vital signs for analysis. McLaughlin says Ulster's focus on sensory monitoring has helped make engineering and computing a popular and highly desirable undergraduate field, a change from the trends of the past few years. "What we are doing now… is bringing the university's computing, engineering, and sensor device skills together so as to address… industry's need for electronic and highly skilled computing engineers," he says.

View Full Article | Return to Top

Scientists Use Brownian Motion to Explore How Birds Flock Together

PhysOrg.com (01/23/09) Zyga, Lisa

Scientists are researching how large groups of animals are able to move as a single body. The phenomenon, known as collective motion, could have implications for a variety of fields, including computer science and robotics. In a recent study, researchers modeled collective motion using Brownian particles, observing the interaction of individual particles making escape and pursuit movements. The researchers modeled an individual as a Brownian particle that possesses internal energy so it can move at various speeds in reaction to external stimuli. The researchers found that at high particle densities, the escape and pursuit interactions can lead to global collective motion. Pawel Romanczuk from Humboldt University of Berlin says that this understanding of collective motion could have far-reaching applications. "The understanding of collective motion is of particular interest to engineers and computer scientists working on the design of autonomous robots," Romanczuk says. "The idea is that simple communicating agents may perform complex tasks as a group without the permanent control of a human for each individual, and which are also robust against the failure of individual agents within the group." He also notes that collective motion mathematical models could be used to develop "realistic computer animations of large animal swarms or even human crowds, which are also used in movie productions."
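
The escape-and-pursuit idea can be sketched as a toy simulation. This is a deliberately simplified 1D caricature, not the authors' actual model: each particle flees a neighbor that approaches it, chases a neighbor that recedes, and receives a small random (Brownian) kick:

```python
import random

# Toy 1D escape-pursuit dynamics (illustrative only, not the paper's model).
def step(pos, vel, r=1.0, strength=0.1, noise=0.05, dt=0.1):
    """Advance all particles one time step."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        force = 0.0
        for j in range(n):
            if i == j:
                continue
            dx = pos[j] - pos[i]
            if abs(dx) > r:          # only neighbors within range interact
                continue
            approaching = dx * (vel[j] - vel[i]) < 0
            if approaching:          # escape: accelerate away from j
                force -= strength * (1 if dx > 0 else -1)
            else:                    # pursuit: accelerate toward j
                force += strength * (1 if dx > 0 else -1)
        new_vel.append(vel[i] + force * dt + random.gauss(0, noise))
    return [p + v * dt for p, v in zip(pos, new_vel)], new_vel

pos = [random.uniform(0, 5) for _ in range(20)]
vel = [random.gauss(0, 0.1) for _ in range(20)]
for _ in range(200):
    pos, vel = step(pos, vel)
```

At high densities the asymmetric interactions tend to align velocities, which is the mechanism the study identifies behind global collective motion.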

View Full Article | Return to Top

Rowan Program Aims to Promote Computer Science Among Minorities, Women

Courier Post (NJ) (01/21/09)

Rowan University's Facilitating Academic Triumph by Providing an Integrated Pipeline Experience (FATPIPE) program was established to attract more women and minority students to computer science. "Basically, I started this because we live in a technological society, and as computer science educators I feel we're ethically bound to prepare students to survive in this technological society," says Rowan professor John Robinson. FATPIPE uses several strategies to increase minority participation and help students succeed. A learning community segment called Learning in Bits and Bytes (LiBBy) helps up to 20 incoming computer science freshmen form bonds through linking courses. LiBBy requires participants to take two low-enrollment freshmen classes together with specially chosen professors, houses students together, and provides mentoring and extracurricular activities. FATPIPE also offers the Computer Science Alternate Route Program, which gives students an alternative path into computer science for those who fall just shy of meeting entrance requirements. "We take students who show potential and provide intervention to prepare them to enter the major," Robinson says. He says that once FATPIPE is more established, program coordinators will explore targeting select high schools to better prepare students for science, technology, engineering, and math majors.

View Full Article | Return to Top

A Meeting of the Minds

Waterloo Record (Canada) (01/24/09) D'Amato, Luisa

Canada's University of Waterloo will host the 2010 International Olympiad in Informatics, which brings together 400 of the most talented high school students from around the world to compete in solving problems that test their skills in problem analysis, algorithm design and data structures, and programming. Waterloo professor Troy Vasiga gives an example of the kind of problem participants would face: given wind flowing at a certain rate and a certain number of windmills, determine where the windmills should be positioned to generate the maximum amount of energy. Teams from 80 countries will attend the August 2010 event. Vasiga will chair the 2010 International Olympiad in Informatics and will also coach the Canadian team. This year's International Olympiad in Informatics will be held in Bulgaria.

View Full Article | Return to Top

Testbeds to Breed Next-Generation Systems

ICT Results (01/28/09)

European researchers working on the UNITE program have developed a virtual testbed that will enable IT developers to fine-tune new devices and systems to ensure that they interact correctly with existing systems. "Until now, when a research group wanted to test something, they often had to 're-invent the wheel,' " says UNITE's Georgios Kormentzas. UNITE developers aimed to accelerate progress within and across technologies by encouraging researchers to share their work. "If you give your testing tool to the research community, you gain access to the tools of the other teams," Kormentzas says. UNITE researchers looked for common features in software tools, hardware tools, single-layer simulators, system-simulation tools, and traffic generators that would allow them to interface with each other. They also looked for ways to integrate new tools into the UNITE platform, which consists of three main components. The first is a visual display terminal (VDT) that can be used to communicate with the virtual testbed. The VDT gives users access to the UNITE controllers and the platform's testing and simulation tools. The controllers also define and designate UNITE time slots for specific actions, such as testing a communications protocol, and maintain a database of prior simulations.

View Full Article | Return to Top

AI Comes of Age

Computerworld (01/26/09) Vol. 43, No. 4, P. 16; Anthes, Gary

Artificial intelligence (AI) research has taken great strides in recent years, proliferating and being incorporated into practical applications. Significant AI milestones include the emergence of ubiquitous computing and more powerful computers; software capable of dealing with uncertainty, incompleteness, and anomalies; algorithms that learn and improve over time; and software agents designed to weigh costs and benefits. Among the latest AI innovations is a new generation of software that integrates learning, vision, navigation, manipulation, planning, reasoning, speech, and natural-language processing. Machine learning forms the core of many present-day AI applications, and the availability of vast volumes of information from the Internet and physical sensors has fueled the technology's progress. Carnegie Mellon University professor Carlos Guestrin says that "as the amount of information increases, our ability to make good decisions may actually decrease. Machine learning and AI can help." Most AI advances have been driven by computer science rather than biology or cognitive science, although Tom Mitchell of Carnegie Mellon University's Machine Learning Department says that new brain studies could enable an unprecedented sharing of information between these disciplines. He observes, for example, that regions of the brain follow pathways predicted by reinforcement learning algorithms used in robots. "AI is actually helping us develop models for understanding what might be happening in our brains," Mitchell says.

View Full Article | Return to Top

PCs Will Become Sensitive

Financial Times Digital Business (01/28/09) P. 4; Shillingford, Joia

Over the next decade, IT will become increasingly pervasive but computers will be less visible as they are incorporated into vehicles, desks, and even furniture, predicts Rob Gear, manager of PA Consulting's IT Innovation Unit. Gear says semantic computing will make businesses more efficient. He predicts that by 2012 machines will be able to understand if a number is a birth date, flight number, or invoice, making businesses more productive when buying or selling online. In the public sector, highways will be outfitted with sensors and cars will use global positioning system devices to reduce bottlenecks. Individuals will carry fewer mobile devices, and computers will become more sensitive to people's emotions. Research in affective computing could advance e-learning by creating automated systems that can recognize when a student is confused and suggest an alternative lesson. Data storage and processing power will continue to become less expensive, and data storage capacities will continue to double. Nanoionic memory, which uses charged atoms to store information in nano systems, will lead to mobile devices that can hold a terabyte of data. Beyond 2050, Gear says artificial intelligence (AI) will have a major impact, leading to AI systems that solve specific problems such as credit card fraud.

View Full Article | Return to Top

A Tool to Verify Digital Records, Even as Technology Shifts

New York Times (01/27/09) P. D3; Markoff, John

University of Washington researchers have released the first component of a public system that will provide authentication for an archive of video interviews with prosecutors and other members of the International Criminal Tribunal for Rwanda, along with the first portion of the Rwandan archive. The system will be available for others to digitally preserve and authenticate first-hand accounts of war crimes, atrocities, and genocide. The tools are needed because advancements in technology have made it possible to alter digital text, video, and audio in nearly undetectable ways. The researchers say the system means the authenticity of digital documents such as videos, transcripts of personal accounts, and court records can be indisputably proved for the first time. The researchers have created a publicly available digital fingerprint, known as a cryptographic hash mark, that will make it possible for anyone to determine that the documents are authentic and have not been tampered with. The digital hash concept was first conceived by IBM's Hans Peter Luhn in the early 1950s, and the researchers are the first to attempt to simplify the application for nontechnical users and offer a complete system for long-term data preservation. Similar efforts to preserve a complete record of the World Wide Web and other documents led to computer scientist Brewster Kahle launching the Internet Archive in 1996. Another digital preservation effort was launched by Stanford University librarians in 2000. Their system, dubbed LOCKSS, for Lots of Copies Keep Stuff Safe, preserves journals by distributing copies of documents over the Internet to an international community of libraries.
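
The verification scheme rests on a standard cryptographic hash: publish the digest of each archived file, and anyone can later recompute it to confirm the file is unchanged. A minimal sketch using SHA-256 (the article does not specify which hash function the Washington system uses):

```python
import hashlib

def fingerprint(path: str) -> str:
    """SHA-256 digest of a file, read in chunks so large videos fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, published_digest: str) -> bool:
    """Recompute the fingerprint and compare it to the published one."""
    return fingerprint(path) == published_digest
```

Because any change to the file, however small, yields a completely different digest, a published fingerprint pins the document: matching digests mean the copy is bit-for-bit identical to what was archived.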

View Full Article | Return to Top

Microsoft SharePoint: A Weak Link In Enterprise Security?

By Tim Wilson

SharePoint, one of the fastest-growing applications in the Windows environment, may also be turning into one of its most serious security liabilities, according to researchers and security vendors.

The SharePoint collaboration tool, which has been licensed more than 85 million times to an estimated 17,000 companies, is one of the easiest-to-use tools in the Windows suite, experts say. In fact, it's so simple that many employees and workgroups deploy it without even asking the IT department for help. But this ease of use has a price: Many IT organizations haven't properly secured their SharePoint deployments, and many others don't know what sensitive data might be stored or exchanged there.

In a survey published earlier this week and sponsored by security vendor Trend Micro, Osterman Research reported that only 60 percent of companies have deployed security tools specifically for SharePoint, while the other 40 percent are relying on traditional server and endpoint security applications. But founder and president Michael Osterman observes that SharePoint data tends to travel beyond these boundaries — SharePoint data is often shared across networks and applications, and sometimes even outside the company.

"Deploying antimalware software at the endpoint or on a server does not fully secure the SharePoint environment — the underlying database, Web pages, etc.," Osterman says.

Osterman's findings are supported by another study conducted by Courion, also a SharePoint security provider, back in September. In that study, Courion found that 25 percent of IT managers either believed their SharePoint security was weak or weren't sure and were worried about it. Nine percent of respondents said their organizations had suffered a breach that may have been attributable to a leak of sensitive data from SharePoint.

View Full Article | Return to Top

Military seeks better use of biometrics on U.S. bases

By Lolita C. Baldor, Associated Press January 28, 2009

WASHINGTON (AP) - On the front lines in Iraq, U.S. troops can scan someone's eye or finger to try to determine if he is a potential enemy or has been connected to a terror attack.

At military bases on U.S. soil, it's not that easy.

The use of biometrics — ranging from simple fingerprints to more advanced retinal and facial scans — has thrived in Iraq, where soldiers carry handheld devices that enable them to link to databases filled with hundreds of thousands of identities.

But in Colorado, military bases just 20 miles or so apart have different identification requirements and access procedures for personnel or contractors trying to get onto the property. The gaps raise security concerns and worries of another attempted massacre scheme, like the one foiled at Fort Dix in New Jersey in 2007.

"Interestingly, we are probably further forward in using biometrics outside our country in some of the combat environments than we are inside our country," Air Force Gen. Gene Renuart, commander of U.S. Northern Command, said Tuesday. "We've got to find a way to fix that."

Speaking at a biometrics conference, Renuart said the military services and law enforcement agencies around the country all carry different ID badges, and many are embedded with different information. And in some cases those agencies, he said, also have different computer databases that don't communicate well.

The more coordinated collection and use of biometrics, however, raises privacy concerns in the United States as well as in Iraq, where there are fears the information could be used for ethnic cleansing, to discern whether someone is a Sunni or a Shiite. And the presidential order for improved information sharing is also among the directives signed by President Bush that may be reviewed anew by the Obama administration.

Using biometrics on the battlefield took hold early in the Afghanistan war, when U.S. officials wanted to allow Afghan Muslim pilgrims to travel to the holy city of Mecca for the annual hajj. Officials reached for biometrics as a way to track who was traveling to Saudi Arabia, and to make sure those were the same people allowed back.
Renuart, the military commander in charge of domestic defense, said the thousands of Defense Department installations across the country — from sprawling bases to smaller compounds — have different access requirements and their plastic-encased ID cards may not look the same or tap into similar personal data.

Guards at the gates, he said, are also confronted with an array of contractors, vendors and representatives from other federal agencies — from the FBI to the Homeland Security Department — all with identification cards that could be faked or stolen.

"ID cards give you data, but they don't necessarily give you all the right data," said Renuart. "They don't give you access into a database that will allow you very quickly to discern whether this person is here legally or not" or if they are a criminal or someone who should not be allowed onto the base.

View Full Article | Return to Top

Tech Insight: How to Pick The Right Web Application Vulnerability Scanner

By Kelly Jackson Higgins
DarkReading

The mistake most people make when they first buy a Web application vulnerability scanner is to assume it's a simple point-and-click tool.

"It's not like network scanning where you go to an IP address and scan the network," says Danny Allan, director of security research for IBM Rational Software, which sells the AppScan vulnerability scanner. "This is not just a point-and-click product."

Web application vulnerability scanning — also known as "black box" testing (as opposed to source-code scanning, or white-box testing) — touches on various levels, transactions, and interactions associated with a Web application. And it requires an experienced hand to run it in order to get the most out of the process of detecting security flaws in Web applications, security experts say.

"The people who are running the scanner matter a lot more than the scanner itself. These are not simple hammers anyone can use. They require the operator to have a significant level of Web security knowledge," says Jeremiah Grossman, CTO of WhiteHat Security, a Web security services firm.

Another misconception about these devices is that the more vulnerabilities they find, the better they are. "Many people go by vuln counts in Web scanners, which is incorrect," notes Caleb Sima, CTO of the application security center at HP, which sells the WebInspect Web app scanner. That's because some products lump together multiple iterations of a specific vulnerability. If one scanner finds 12 SQL injection flaws, and another finds five, it doesn't mean the second one is necessarily missing more bugs, Sima says.

Consider, then, how a scanner counts vulnerabilities rather than how many times it finds SQL injection bugs. Even more important, IBM's Allan says, is the underlying coding problem that caused the vulnerability. "You may have one cross-site scripting vulnerability and 80 different ways to exploit it," he says. "You need to focus not on the [vulnerability] issue, but on why it happened…that helps prevent security issues from happening again in the future."
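
Sima's point about vulnerability counts can be made concrete: compare scanners by distinct flaws rather than raw reports. A sketch with an invented finding format (the data and field names are hypothetical, not output from any real scanner):

```python
from collections import defaultdict

# Hypothetical scanner findings: (vuln_class, page, parameter).
findings = [
    ("sqli", "/login", "user"),
    ("sqli", "/login", "password"),
    ("sqli", "/search", "q"),
    ("xss",  "/profile", "bio"),
    ("xss",  "/profile", "bio"),   # same flaw reported twice
]

def count_by_root_cause(findings):
    """Count distinct (page, parameter) flaws per class, not raw reports."""
    grouped = defaultdict(set)
    for vuln_class, page, param in findings:
        grouped[vuln_class].add((page, param))
    return {cls: len(spots) for cls, spots in grouped.items()}

print(count_by_root_cause(findings))  # {'sqli': 3, 'xss': 1}
```

Deduplicated this way, a scanner reporting five raw XSS hits and one reporting a single hit may be describing the same one flaw, which is why headline vulnerability counts mislead.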

View Full Article | Return to Top

Weizmann Institute Scientists Create Working Artificial Nerve Networks

Weizmann Institute of Science (01/28/09)

At the Weizmann Institute of Science, Physics of Complex Systems Department professor Elisha Moses and former research students Ofer Feinerman and Assaf Rotem have created logic gate circuits made from living nerve cells grown in a lab. The researchers say their work could lead to an interface that links the brain and artificial systems using nerve cells created for that purpose. The cells used in the circuits are brain nerve cells grown in culture. The researchers grew a model nerve network in a single direction by getting the neurons to grow along a groove etched in a glass plate. Nerve cells in the brain are connected to a vast number of other cells through axons, and must receive a minimum number of incoming signals before they relay the signal. The researchers found a threshold, about 100 axons, below which the chance of a response was questionable. The scientists then used two thin stripes of about 100 axons each to create an AND logic gate. "We have been able to enforce simplicity on an inherently complicated system. Now we can ask, 'What do nerve cells grown in culture require in order to be able to carry out complex calculations?' " Moses says. "As we find answers, we get closer to understanding the conditions needed for creating a synthetic, many-neuron 'thinking' apparatus."
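
The thresholding behavior described above maps onto a simple computational model: a downstream cell fires only if the number of active incoming axons clears a threshold, so two roughly 100-axon stripes with a threshold set above one stripe's worth behave as an AND gate. The exact numbers below are illustrative, not the paper's measurements:

```python
# Toy model of the neuronal AND gate: the cell fires only when the
# count of active incoming axons exceeds a threshold set between
# one stripe's worth (100) and two stripes' worth (200).
AXONS_PER_STRIPE = 100
THRESHOLD = 150  # illustrative; must lie between 100 and 200 for AND

def neuron_fires(stripe_a_active: bool, stripe_b_active: bool) -> bool:
    active_axons = (AXONS_PER_STRIPE * stripe_a_active
                    + AXONS_PER_STRIPE * stripe_b_active)
    return active_axons > THRESHOLD

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", neuron_fires(a, b))  # fires only when both are True
```

The same thresholding trick with a lower cutoff (below 100) would give an OR gate instead, which is why controlling axon counts lets the researchers choose the logic the culture computes.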

View Full Article | Return to Top

Fight Brews Over How to Build a Better Internet

Christian Science Monitor (01/29/09) P. 2; Arnoldy, Ben

The U.S. Internet infrastructure is slower and more expensive than that of many other developed nations, and it reaches a smaller share of its citizens. About 10 percent of U.S. households do not have access to a high-speed or broadband data connection, and only 3 percent have fiber-optic connections capable of delivering the high-speed data rates that analysts believe will become a necessity in the future. Meanwhile, countries such as Sweden are already adopting the next generation of Internet infrastructure. Part of the economic stimulus package currently being debated by the U.S. Congress includes $6 billion to upgrade America's Internet. Although experts disagree on how that money should be used, few question the importance of improving the Internet's infrastructure. Experts say that slow Internet access is holding back economic and scientific progress. The Communications Workers of America says that a $5 billion investment in broadband expansion would create 100,000 new jobs in telecommunications and information technology. Large companies favor government tax breaks to promote rapid Internet expansion, while Internet service providers in underserved areas say they need grants to pay for extending broadband to remote locations. Under the current House bill, all the spending will be allocated through project grants, instead of tax credits, which may not get used for another year. Rob Atkinson, president of the Information Technology and Innovation Foundation, says the government should provide both grants and tax breaks, and says much more funding is needed. "We'd have to invest $30 billion to get to the same level [of broadband penetration] as the Swedes," he says.

View Full Article | Return to Top

Fighting Malware: An Interview With Paul Ferguson

InfoWorld (01/23/09) Grimes, Roger A.

Trend Micro senior researcher Paul Ferguson says the sheer volume of malware today is incredible, and the real challenge is collecting data from as many points as possible and arranging the facts so that law enforcement can use that information as evidence. "The better job we can do collecting and normalizing the data up front, the easier it is to help law enforcement to get subpoenas and arrest warrants," Ferguson says. In Russia, Ukraine, and Eastern Europe, a few large organizations make the majority of the malware, though they pretend to be many small groups. Part of Ferguson's job involves correlating data to identify members of these groups through digital fingerprints. These groups generally use tried and true techniques. Their bots and worms are very similar and attacks often come from the same IP addresses, hosts, and DNS services. However, even these large groups use numerous freelance, low-level operators that provide specific skills. A major problem is that many of the larger players use policy holes to operate out in the open in countries like Russia where people such as Ferguson are powerless to stop them. Ferguson says much of the malware coming from China is actually from Russian groups that use the millions of unpatched PCs in China to launch attacks. He says most of the hacking in China, aside from the few professional criminal groups focusing on corporate espionage and the state-sponsored attacks on other governments, is actually social.

View Full Article | Return to Top

Is Technology Producing a Decline in Critical Thinking and Analysis?

UCLA News (01/27/09) Wolpert, Stuart

University of California, Los Angeles (UCLA) professor Patricia Greenfield says that critical thinking and analysis skills decline the more people use technology, while visual skills improve. Greenfield, the director of UCLA's Children's Digital Media Center, analyzed more than 50 studies on learning and technology. She found that reading for pleasure improves thinking skills and engages the imagination in ways that visual media cannot. She says the increased use of technology in education will make evaluation methods that include visual media a better test for what students actually know, and will create students who are better at processing information. However, she cautions that most visual media does not allocate time for reflection, analysis, or imagination. "Studies show that reading develops imagination, induction, reflection, and critical thinking, as well as vocabulary," Greenfield says. "Students today have more visual literacy and less print literacy." Greenfield also analyzed a study that found that college students who watched "CNN Headline News" without the news crawl on the bottom of the screen remembered more facts from the broadcast than those who watched with the crawl. She says this study and others like it demonstrate that multi-tasking prevents people from obtaining a deeper understanding of information.

View Full Article | Return to Top

Web 3.0 Emerging

Computer (01/09) Vol. 42, No. 1, P. 88; Hendler, Jim

Web 3.0 is generally defined as Semantic Web technologies that run or are embedded within large-scale Web applications, writes Jim Hendler, assistant dean for information technology at Rensselaer Polytechnic Institute. He points out that 2008 was a good year for Web 3.0, based on the healthy level of investment in Web 3.0 projects, the focus on Web 3.0 at various conferences and events, and the migration of new technologies from academia to startups. Hendler says the past year has seen a clarification of emerging Web 3.0 applications. "Key enablers are a maturing infrastructure for integrating Web data resources and the increased use of and support for the languages developed in the World Wide Web Consortium (W3C) Semantic Web Activity," he observes. The application of Web 3.0 technologies, in combination with the Web frameworks that run Web 2.0 applications, is becoming the benchmark of the Web 3.0 generation, Hendler says. The Resource Description Framework (RDF), which links data from multiple Web sites or databases, serves as the foundation of Web 3.0 applications. Once the data is rendered in RDF, the development of multisite mashups is enabled by the use of uniform resource identifiers (URIs) for blending and mapping data from different resources. Relationships between data in different applications or in different parts of the same application can be deduced through the RDF Schema and the Web Ontology Language, facilitating the linkage of different datasets via direct assertions. Hendler writes that a key difference between Web 3.0 technologies and artificial intelligence knowledge representation applications lies in the Web naming scheme supplied by URIs, combined with the inferencing in Web 3.0 applications, which supports the generation of large graphs that can underpin large-scale Web applications.
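
The merging Hendler describes falls out of shared URIs: because two datasets name the same resource with the same identifier, the union of their triples is already one graph. A minimal sketch with invented example URIs and predicates:

```python
# Two hypothetical RDF-style datasets, each a set of
# (subject, predicate, object) triples. The URIs and predicate names
# are invented for illustration. Because both sites use the same URI
# for the same person, a plain set union yields a merged graph.
site_a = {
    ("http://example.org/person/42", "name", "Ada"),
    ("http://example.org/person/42", "worksFor", "http://example.org/org/7"),
}
site_b = {
    ("http://example.org/person/42", "homepage", "http://ada.example.net"),
    ("http://example.org/org/7", "name", "Example Labs"),
}

merged = site_a | site_b

def properties_of(graph, subject):
    """All (predicate, object) pairs recorded for one URI in a graph."""
    return sorted((p, o) for s, p, o in graph if s == subject)

print(properties_of(merged, "http://example.org/person/42"))
```

Real systems add RDF Schema and OWL inference on top of this union to deduce relationships neither site stated directly, but the URI-based merge itself is this simple.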

View Full Article | Return to Top

SANS OUCH! Newsletter: Is My Computer Infected with a Virus? What Should I Do?

SANS OUCH! Volume 6, Number 2

You may not realize that you've introduced a virus until you notice something isn't quite right. Here are some signs that your computer might be infected:
• Your computer runs more slowly than normal
• Your computer stops responding or locks up often
• Your computer crashes and restarts every few minutes
• Your computer restarts on its own and then fails to run normally
• Applications on your computer don't work correctly
• Disks or disk drives are inaccessible
• You can’t print normally
• You see unusual error messages
• You see distorted menus and dialog boxes
These are common symptoms of infection—but they might also indicate hardware or software problems that have nothing to do with a virus.

Be smart!
• Do not ignore the symptoms. Write them down, especially the text of any unusual error messages.
• Look for a pattern, and make a note of it. For example, are all of your applications affected? Is the problem only with printing? When does your system crash?
• Contact your network administrator (computer help desk) or your Internet Service Provider, or call the technical support number provided by the manufacturer of your system.
• Answer the technician’s questions carefully, and describe the problem in as much detail as possible. The more useful information you can provide, the quicker the problem will be resolved.
• The technician may advise you to stop using your computer. If so, follow that advice. Short-term inconvenience is better than losing all your data or having your identity stolen.

More information: http://www.microsoft.com/protect/computer/viruses/indicators.mspx

View Full Article | Return to Top

GUIDE TO INFORMATION SECURITY TESTING AND ASSESSMENT

By Shirley Radack, Editor

Computer Security Division
Information Technology Laboratory
National Institute of Standards and Technology

A comprehensive approach to information security testing and assessment is essential to the secure operation of an organization’s information technology (IT) systems. By applying technical testing and examination techniques, organizations can identify and assess the vulnerabilities of their systems and networks, and then take steps to improve their overall security.

The Information Technology Laboratory of the National Institute of Standards and Technology (NIST) recently published a new guide to help organizations conduct their information security assessments. Issued in September 2008, the guide presents the key elements of security testing and assessments, explains the specific techniques that can be applied, and recommends effective methods for implementing testing and assessment practices.

NIST Special Publication (SP) 800-115, Technical Guide to Information Security Testing and Assessment: Recommendations of the National Institute of Standards and Technology

NIST SP 800-115, Technical Guide to Information Security Testing and Assessment, was written by Karen Scarfone and Murugiah Souppaya of NIST, and by Amanda Cody and Angela Orebaugh of Booz Allen Hamilton. The new guide replaces NIST SP 800-42, Guideline on Network Security Testing.

NIST SP 800-115 presents the basic technical aspects of conducting information security assessments. It discusses technical testing and examination methods that an organization might use as part of an assessment, and helps organizations to apply the techniques effectively to their systems and networks. The guide stresses the importance of organizational support to the technical assessment process through sound planning, careful analysis of findings, and regular reporting of results and recommendations to management officials.

View Full Article | Return to Top
