Volume 14, Issue 3

Sensors Help Keep the Elderly Safe, and at Home

New York Times (02/13/09) P. A1; Leland, John

Sensors and other monitoring technologies offer senior citizens more freedom to live independently at home, and at lower risk. Motion sensors, medication reminder systems linked to mobile phones, pill compliance detectors, and wireless devices that transmit data on blood pressure and other physiological indicators are just some of the tools in use. These systems can be less costly than assisted living and nursing home care. One objective of personal health monitoring is to spur people to improve their health by changing their behavior, knowing that they are being observed. However, the technologies are largely untested and are not usually covered by government or private insurance plans. Moreover, there is a danger that the technologies could substitute for one-on-one interaction between seniors and their physicians, nurses, and relatives. "It's not that we need new technologies," says Dr. Jeffrey Kaye of the Oregon Health and Science University. "We need to use what we have more creatively." Monitoring technologies can gather terabytes of data, and researchers are working on ways of analyzing that information to improve users' well-being. For example, Kaye is working with Intel on a program that analyzes seniors' motion data for patterns that would point to the onset of dementia well before it could be diagnosed with cognitive tests.
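The kind of pattern analysis Kaye describes can be illustrated with a minimal sketch. This is not the Intel/OHSU program; the baseline window, z-score threshold, and the idea of using weekly activity counts are all assumptions for illustration only.

```python
# Illustrative sketch: flag weeks whose motion-sensor activity deviates sharply
# from a resident's own recent baseline, the kind of drift that might warrant
# clinical follow-up. Thresholds and window size are invented for the example.
from statistics import mean, stdev

def flag_unusual_weeks(weekly_counts, window=8, threshold=2.0):
    """Return indices of weeks whose z-score against the trailing window
    exceeds the threshold."""
    flagged = []
    for i in range(window, len(weekly_counts)):
        baseline = weekly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(weekly_counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

counts = [210, 205, 198, 214, 207, 201, 209, 203, 120]  # sharp drop in week 8
print(flag_unusual_weeks(counts))  # → [8]
```

A real system would of course model daily rhythms and long-term trends rather than a single aggregate count.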

Full Article | Return to Top

Microsoft Announces $250,000 Conficker Worm Bounty

Network World (02/12/09) Messmer, Ellen

In an effort to stop the spread of the Conficker/Downadup worm, which is believed to have infected at least 10 million PCs around the world since November, Microsoft is offering a $250,000 reward for anyone who has information that leads to the arrest and conviction of those responsible for spreading the malicious code. In addition to offering the reward, Microsoft has partnered with security vendors, Internet registries and DNS providers such as ICANN, ORG, and NeuStar, to stop the Conficker worm from spreading further. Despite the efforts by Microsoft and others, the Conficker worm is set to wreak greater havoc on the world's PCs, security experts say. Experts say the worm connects to more than 250 command-and-control servers around the world every day as it awaits instructions on future downloads or actions. But the coalition formed by Microsoft is planning to take action to target the worm's update mechanism, including taking out the unique domain names for servers used for Conficker control, says Symantec's Gerry Egan. Microsoft says the coalition has already disabled a significant number of domains targeted by Conficker in an effort to disrupt the use of the worm and prevent attacks.

View Full Article | Return to Top

EU Spending on R&D Slips Further Behind US

EurActiv.com (02/10/09)

The European Union (EU) is falling further behind the United States in research and development (R&D) funding, according to a new study by European Investment Bank economist Kristian Uppenberg. The report concludes that the EU will probably come up short in reaching the Lisbon Strategy goal of committing 3 percent of gross domestic product to research by next year. "It appears that if Europe cannot close its R&D gap with the U.S. in services, the overall R&D gap is likely to widen rather than narrow as the share of services in total value added grows," Uppenberg warns. He contends that Europe's attempts to catch up with long-term rivals should not ignore large companies and innovative clusters, and points out that investments tend to yield better value for companies sited near other firms with high levels of R&D intensity. The implication is that spreading investment across member states for the sake of regional parity can be wasteful in terms of return on investment. Uppenberg's report cites Organization for Economic Cooperation and Development data to demonstrate that small- and medium-sized enterprises tend to account for a relatively small portion of total business R&D spending in nations with high aggregate R&D intensities such as the United States, Japan, Britain, Finland, Germany, and Sweden.

View Full Article | Return to Top

Sniffing Out Illicit BitTorrent Files

Technology Review (02/12/09) Graham-Rowe, Duncan

Illegal content transferred using the BitTorrent file-trading protocol can be detected and tracked through a new method that monitors networks without disrupting the data stream, according to its creators. When the tool spots an illicit file, it retains a record of the network addresses involved for analysis, says the Air Force Institute of Technology's Karl Schrader. Peer-to-peer transfers now account for the majority of traffic for many Internet service providers, which are generally only interested in this kind of traffic for the purpose of controlling or "throttling" it to liberate bandwidth for other uses. Such traffic-shaping reveals nothing about the contents of each transfer, Schrader says, and while a small number of network-monitoring tools can identify specific BitTorrent files, it is generally a slow process. "Our system differs in that it is completely passive, meaning that it does not change any information entering or leaving a network," he says. The system first detects files that exhibit the signs of the BitTorrent protocol by analyzing the first 32 bits of the files' header data, and then examines the files' hash. If a hash matches any stored in a database of banned hashes, the system records the transfer and stores the network addresses involved. The method's speed is partly explained by a specially configured field-programmable gate array chip and a flash-memory card that stores a log of the illegal activity, allowing file contents to be scanned directly by tapping into an Ethernet controller buffer without interfering with network traffic. Schrader says the monitoring cannot be detected by users.
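The two-stage check the article describes (a 32-bit header test, then a hash lookup) can be sketched in software. The real system runs in an FPGA; the packet layout used below follows the standard BitTorrent handshake, but the banned-hash database and return values are invented for illustration.

```python
# Hedged sketch of the two-stage check: does a packet start like a BitTorrent
# handshake, and if so, is its info-hash on a banned list? A standard handshake
# is: 1 length byte (0x13) + "BitTorrent protocol" + 8 reserved bytes +
# 20-byte info-hash + 20-byte peer ID.
BT_HANDSHAKE_PREFIX = b"\x13Bit"   # first 32 bits of the handshake

BANNED_HASHES = {
    bytes.fromhex("aa" * 20),      # placeholder 20-byte info-hash of a banned file
}

def inspect_packet(payload):
    """Passively classify a packet without modifying it."""
    if payload[:4] != BT_HANDSHAKE_PREFIX:
        return "not-bittorrent"
    info_hash = payload[28:48]     # info-hash follows the 28-byte preamble
    return "banned" if info_hash in BANNED_HASHES else "allowed"

handshake = b"\x13BitTorrent protocol" + b"\x00" * 8 + bytes.fromhex("aa" * 20)
print(inspect_packet(handshake))   # → banned
```

Because the check is a prefix comparison plus a set lookup, it maps naturally onto the fixed-function hardware the article mentions.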

View Full Article | Return to Top

Update on CCC Robotics

Computing Community Consortium (02/11/09) McCallum, Andrew

The Computing Community Consortium (CCC) initiative in robotics has finished its workshops and developed a roadmap, and will provide selected portions of the roadmap to the National Science Foundation, the National Institute of Standards and Technology, the Defense Advanced Research Projects Agency, the National Institutes of Health, and the Office of Science and Technology Policy. The CCC robotics initiative also is organizing a U.S. Congressional caucus on robotics that will take place in March. In early 2008, the initiative organized four workshops, with each one specializing in a specific area of robotics. The manufacturing and logistics workshop found that robotics has significant potential in processes such as logistics and material handling but that little attention has been given to these applications. The workshop concluded that new methods for easily programming robots are needed, as are ways to integrate sensory information to create safe robotic operations. The medical robotics workshop determined that a wider adoption of healthcare robotics will require new methods in machine learning, human-robot interaction, and flexible mechanisms for physical interaction with humans. The service robotics workshop emphasized that service robotics falls into two categories: professional and domestic. The emerging technologies workshop presented several opportunities that may arise as sensing becomes ubiquitous, more flexible mechanisms are designed, and new technologies such as nanotechnology become available. The use of machine learning, and new types of interfaces with high connectivity, could create entirely new opportunities for robotics.

View Full Article | Return to Top

Berkeley Releases Cloud Computing Study

HPC Wire (02/12/09)

University of California, Berkeley professors David Patterson and Armando Fox, researchers at Berkeley's Reliable Adaptive Distributed Systems Laboratory, have co-authored "Above the Clouds," a paper that analyzes the emerging cloud computing model. In this interview, the professors discuss the impact that cloud computing will have on high-performance computing (HPC). Fox says the presence of massive data centers composed of tens of thousands of commodity computers is the single most important technological element supporting the viability of cloud computing. Patterson says this and other innovations are accompanied by a business model that delivers the semblance of unlimited computing resources available on demand, the removal of an up-front commitment by cloud users, and the ability to pay for computing resources on a short-term basis as needed and to let them go when not needed. Fox acknowledges that the adoption of cloud computing could be potentially hindered by the uncertainty of having one's data and applications "locked in the cloud," and Patterson notes that dependence on a single cloud computing provider carries a risk to business continuity. Fox says the provision of standardized application programming interfaces with cross-vendor functionality would help solve two challenges to cloud computing: the prevention of data lock-in and the maintenance of high availability. "In general, the HPC community has not had to go through the process of re-architecting software that the Web community went through in the 90s," Fox says. "We think there are plenty of opportunities for innovation if HPC steps up to the plate, and an early demonstration would go a long way toward jump-starting that area." He says that imbuing software with horizontal scalability will help it exploit the cloud computing model, while Patterson recommends that hardware systems be designed at the scale of at least 12 racks, as that will be the minimum purchase size.

View Full Article | Return to Top

IBM Blue Gene Supercomputer Looks to Break the Petaflop Mark in Europe

eWeek (02/10/09) Ferguson, Scott

IBM and the German research center Forschungszentrum Juelich will build a Blue Gene/P supercomputer in Germany later this year that could set a new supercomputing record for Europe. The water-cooled supercomputer could become the first machine in Europe to break the petaflop barrier. Currently, the only two supercomputers capable of petaflop performance are in the United States. IBM's Roadrunner system, at the U.S. Department of Energy's (DOE's) Los Alamos National Laboratory in New Mexico, was the first to officially break the petaflop barrier. The second machine is the Cray XT Jaguar supercomputer at the DOE's Oak Ridge National Laboratory in Tennessee. IBM recently announced plans to build a supercomputer for the DOE dubbed Sequoia with a top speed of 20 petaflops. Sequoia is expected to go online in 2012. The Blue Gene family of supercomputers uses processors based on IBM's Power Architecture. The IBM supercomputer in Germany will contain 72 racks housing 294,912 processors, 144 terabytes of memory, and 6 petabytes of hard disk storage.
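The article's totals imply a per-rack breakdown, which is easy to check. The per-rack figures below are derived from the quoted numbers, not stated in the article.

```python
# Consistency check of the quoted Blue Gene/P figures: 72 racks,
# 294,912 processors, 144 TB of memory.
racks = 72
processors = 294_912
memory_tb = 144

print(processors // racks)         # → 4096 processors per rack
print(memory_tb * 1024 // racks)   # → 2048 GB of memory per rack
```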

View Full Article | Return to Top

Personalising the European Classroom

ICT Results (02/12/09)

The iClass learner-centric technology platform, which enables students to have more control over the learning process, is generating great interest across Europe. "IClass really is the largest Europe-wide R&D project of its kind focusing on primary and secondary education," says Ali Turker with SEBIT Education and Information Technologies. He says that information and communication technologies platforms can help embed personalization within formal education and bring structural value to informal learning. IClass offers a blend of formal and informal learning styles and environments that helps ready learners for independent inquiry. The iClass system conditions users to cover three cyclical stages of self-regulation (planning, learning, and reflection) via a semiotic-based user interface and context-sensitive tips and advisories. The system enables students to tailor assignments to individual learning needs by adding or updating learning objectives and by managing task activities, while teachers can adjust certain parameters, add hints, and observe and comment on students' progress. IClass also provides a "personal space" to compile learning history and outcomes in order to give students an overview of their attitudes and accomplishments in tasks and goals. Users can tap a corpus of professionally developed content from SEBIT, user-generated content, and European resources.

View Full Article | Return to Top

Software Speeds Up Molecular Simulations

Stanford News (02/04/09) Ku, Joy P.; Bergeron, Louis

Stanford University's Open Molecular Mechanics (OpenMM) project has developed open source software that enables researchers to perform complex simulations of molecular motion on desktop computers far faster than previously possible. "Simulations that used to take three years can now be completed in a few days," says OpenMM project principal investigator and Stanford professor Vijay Pande. "With this first release of OpenMM, we focused on small molecular systems and saw speedups of 100 times faster than before." OpenMM features a set of advanced hardware and software technologies that graphics processing units (GPUs), working with a computer's central processing unit, use to accelerate applications beyond just creating or manipulating graphics. OpenMM will enable molecular dynamics (MD) simulations to run on most high-end GPUs currently used in laptop and desktop computers and uses specially designed algorithms that allow molecular dynamics software to fully capitalize on the GPU architecture. "OpenMM will be a tool that unifies the MD community," says Russ Altman, chair of the Department of Bioengineering at Stanford. "Instead of difficult, disparate efforts to recode existing MD packages to enjoy the speedups provided by GPUs, OpenMM will bring GPUs to existing packages and allow researchers to focus on discovery."

View Full Article | Return to Top

Japanese Robot/Humanoid Innovations Update: Mankind's Best New Friend is Getting Better

PhysOrg.com (02/05/09) Simpson, Mary Anne

Japan's Information and Robot Technology (IRT) Research Initiative is working on the Home Assistant Robot Project, an effort to create a robot capable of serving as a housekeeper and caregiver. Such a robot would need fine motor skills, be able to balance on one foot, and lift objects. The project has created Assistant Robot, which features a wide-angle stereo camera, a telephoto stereo camera, and ultra-sensitive sensors. The robot operates on a two-wheel drive base with balancing wheels, and is capable of sweeping the floor, picking up a tray of dishes, moving the dishes to the sink, loading the dishwasher, moving chairs, and putting dirty clothes in the washer. Another research effort at Tokyo's Waseda University recently unveiled a humanoid robot named Twenty-One, which is equipped with manual dexterity capable of picking up a drinking straw, placing the straw in a tumbler, and handing the drink to a human. Twenty-One has voice recognition capabilities and three soft fingers with opposable thumbs. The researchers say Twenty-One is strong enough to support disabled patients to help them move around.

View Full Article | Return to Top

Professors Regard Online Instruction as Less Effective Than Classroom Learning

Chronicle of Higher Education (02/10/09) Shieh, David

Many education professionals believe that online courses are less effective than classroom courses, conclude two surveys conducted by the National Association of State Universities and Land-Grant Colleges. The surveys also found "widespread concern" that budget constraints would hinder online learning programs. The surveys, which polled faculty members and administrators on their opinions of distance-learning programs, indicate that a majority of faculty members acknowledge that distance education offers students increased access and flexibility, but that developing and teaching online courses is difficult. Instructors are not rewarded financially or professionally for the extra time and effort they spend on online classes, and most feel that online education does not create better learning outcomes. Only 30 percent of the 10,000 faculty members surveyed said that online courses provided equal or superior learning outcomes compared to face-to-face classes, while 70 percent said that learning outcomes were inferior. Among faculty members who had taught an online course, 48 percent said that online classes create inferior learning outcomes. A majority of faculty members said that institutions provide inadequate compensation for the additional responsibilities of teaching an online course, and many also said that students need more discipline to benefit from online education. Administrators emphasized the need for schools to incorporate online learning into their mission statements, create a single office to oversee online-learning programs, and foster institution-wide discussions on online learning.

View Full Article | Return to Top

Setting the Stage for Agile Development

SD Times (02/01/09) No. 215, P. 5; Feinman, Jeff

Corporate trainers are increasingly applying methods from acting, improvisation, and other art forms to agile development training so that software developers can be better prepared for changing requirements and other unanticipated events across the agile development cycle. The result is better teamwork among developers. Corporate trainer Matt Smith says that actors often must overcome their anxieties, a situation that parallels that of software developers. "If we're going to do Scrum and agile, we have to come to terms with that feeling and stop perceiving it as something to run from," he says. Certified Scrum trainer Stacia Broderick says the key to successful collaboration is a lack of inhibition, combined with frankness and a willingness to propose any concept. Trainers run agile workshops in which participants are asked to leave their comfort zone so they can learn how to deal with the unexpected when developing software. Smith notes that many developers, whose work is usually solitary, are unaccustomed to the social interaction agile development entails. Through improvisational training, developers can learn the value of giving up control by letting go of their agendas, judgment, control, and anticipation in order to advance projects through transparency and receptivity. Smith's exercises place people in non-work situations and encourage them to perform tasks that are more effective when carried out by teams rather than individuals.

View Full Article | Return to Top

How to Beat the Recession Using Underutilized Technology

Computer Weekly (02/10/09) Pincher, Michael

Experts say there are underutilized digital applications that could be tapped to pull the world out of the economic recession. Several categories of corporate innovation (customer-oriented, product, process, and strategic innovation) are frequently overlooked. "Hype cycles," as described by Gartner, are cycles in technology innovation marked initially by overenthusiasm and then disenchantment, followed by acceptance and productivity once the innovations pass beyond hype. Green information technology, social computing platforms, microblogging, cloud computing, and video telepresence are current technologies that are expected to have a transformative impact over the next several years. There also are various emerging technologies that could offer organizations insight into the marketplace's business and innovation potential. These technologies include enterprise portals, data management, content management, automotive electronics, consumer technologies, compliance technologies, human-computer interaction, enterprise speech technologies, XML technologies, nanotechnology, infrastructure protection, identity and access management, virtualization, information security, risk management, print management, networks and collaboration, wireless networking, Web technologies, and storage management.

View Full Article | Return to Top

Smart Companies Still Looking for Smart IT People

InformationWeek (02/10/09) McGee, Marianne Kolbasuk

Several information technology (IT) skills are still in high demand despite the economic downturn, reveal employment figures from the U.S. Bureau of Labor Statistics (BLS) and an IT pay trends study from Foote Partners. The BLS says the U.S. added 11,000 management and technical consulting service jobs in January, in addition to the 9,000 management and consulting service jobs that were added in October and November. Foote Partners CEO David Foote says that IT professionals with specific skill sets have been receiving pay increases and perks despite the recession. He says the pressure to decrease costs has generated new interest in automation, automation software, and business process improvements, improving pay for professionals with noncertified management, methodology, and process skills. IT professionals with related skills have seen average pay premiums rise 5.6 percent over the last three months, and professionals with those specific skills saw an average increase of 10.3 percent. Also in high demand are IT architects and project managers with various certifications. "Architecture's key to running a good solid business, and there aren't enough architecture skills out there," Foote says. Database, security, and storage skills also are in demand, he says.

View Full Article | Return to Top

Road to Grid Computing Remains Difficult

Computerworld New Zealand (02/11/09) Bell, Stephen

Grid computing pioneer Ian Foster says that grid and cloud computing have not yet reached their full potential. Foster says the reliable infrastructure management that is needed for remote applications and data is still difficult to establish, creating an "energy barrier" between the providers of utility computing services and potential users. Nevertheless, he says progress is being made. "If you look at what people are doing with computing… they're not simply invoking programs that may run locally or remotely—they're often performing very complex activities that may involve data from one location, software from another," he says. Providing services capable of supporting such workloads in an on-demand way is a complex task that could involve multiple grids. For example, researchers could use the Social Informatics Grid, which lets researchers search a large resource of social sciences data, to find appropriate data and then analyze that data on the Open Science Grid, which provides tools and computing power. Foster says the goal of utility computing should be to provide computer power and access to data and software that can be expanded for unforeseen peaks in demand. Over the past few years, businesses have started to reduce the energy barrier by providing simple services that meet the demands of large groups of people. These software-as-a-service efforts are the best-known use of cloud computing, but reliable, available infrastructure also could be provided as a service mode or platform, Foster says.

View Full Article | Return to Top

Lack of Diversity Part of Equation in STEM Fields

Pittsburgh Post-Gazette (02/10/09) Chute, Eleanor

Tonya Groover, a computer science Master's student at the University of Pittsburgh, founded and directs the Technology Leadership Institute at Pitt, which features a six-week academic-enrichment program for high school students designed to teach technical skills. The institute's purpose is to get more women and underrepresented minorities interested in science, technology, engineering, and mathematics (STEM). Many colleges and universities have similar programs that include tutoring, mentoring, and research opportunities. Blacks account for about 15 percent of the population between 20 and 24 years old, but receive only about 8 percent of STEM degrees, and Hispanics have a similar percentage, according to the National Science Foundation (NSF). The number of STEM bachelor's degrees is about evenly split between men and women, but women earn more bachelor's degrees overall, which means a smaller percentage of the total degrees awarded to women are in STEM fields. NSF also notes that computer and information sciences are less popular among women, black, and Hispanic students. Men accounted for three-fourths of the 2005 graduates nationwide. Approximately 55 percent of graduates were white, 12 percent Asian, 10 percent black, and 6 percent Hispanic. "Women are attracted to areas or fields [in which] they feel they're making a difference," Groover says. "Computer science is not presented in such a manner. Computer science is a field that will make a difference in the lives of hungry people or sick people."

View Full Article | Return to Top

Cognitive Computing Project Aims to Reverse-Engineer the Mind

Wired News (02/06/09) Ganapati, Priya

IBM Almaden Research Center cognitive computing project manager Dharmendra Modha has a plan to engineer the mind by reverse-engineering the brain. Neuroscientists, computer engineers, and psychologists are working together to create a new computing architecture that simulates the brain's perception, interaction, and cognitive abilities. The researchers hope to first simulate a human brain on a supercomputer, and then use new nano-materials to create logic gates and transistor-based equivalents of neurons and synapses to build a hardware-based, brain-like system. The effort has received a $5 million grant from the Defense Advanced Research Projects Agency, which is enough to run the first phase of the project. The researchers say if the project is successful it could lead to a new computing system within the next decade. "The idea is to do software simulations and build hardware chips that would be based on what we know about how the brain and how neural circuits work," says University of California-Merced professor and project participant Christopher Kello. The researchers started by building a real-time simulation of a small cerebral cortex, which has the same structure in all mammals. The simulation required 8 terabytes of memory on an IBM BlueGene/L supercomputer. Modha says the simulation, although not complete, offered insights into the brain's high-level computational principles. A human cerebral cortex is about 400 times larger than the small mammal simulation and would require a supercomputer with a memory capacity of 3.2 petabytes and a computational capacity of 36.8 petaflops. While waiting for supercomputing technology to improve, the researchers are working on implementing neural architectures in silicon.
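The memory figures in the article are consistent with the stated 400x scale-up, which can be checked in one line:

```python
# Checking the article's scaling claim: the human-cortex simulation is said to be
# about 400 times the small-mammal one, which needed 8 TB on Blue Gene/L.
small_cortex_tb = 8
scale = 400

human_cortex_pb = small_cortex_tb * scale / 1000   # using 1 PB = 1000 TB
print(human_cortex_pb)   # → 3.2, matching the quoted 3.2 petabytes
```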

View Full Article | Return to Top

Unnatural Selection: Robots Start to Evolve

New Scientist (02/04/09) No. 2694, P. 20; Marks, Paul

Existing robots typically require a complete overhaul of their control software in order to adjust to physical alterations. Artificial intelligence expert Christopher MacLeod and colleagues at Scotland's Robert Gordon University sought a way around this problem by developing a robot that mimics biological evolution. MacLeod's robot brain adapts to physical changes through the assignment of new "neurons" via an incremental evolutionary algorithm (IEA). The IEA is programmed to freeze the neural network it has evolved when it realizes that its evolutions are no longer enhancing the robot's execution of its primary command. After physical modifications are made and the IEA notes that these changes are inhibiting the robot from fulfilling its primary command, it automatically adds new neurons to form another neural network programmed to cope with the changes, freezing it when its performance reaches its upper limit. This process is repeated for every new component or function added to the robot. The University of Reading's Kevin Warwick is skeptical of MacLeod's method, arguing that "[MacLeod's] approach will result in many more neurons being needed to do the job badly, when a smaller number of neurons would have done well." However, MacLeod is confident that his approach will lead to more advanced robots. He says the software "can build layer-upon-layer of complexity to fulfill tasks in an open-ended way."
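The evolve-until-plateau-then-freeze loop the article describes can be sketched in a toy form. This is not MacLeod's actual algorithm: the hill-climbing search, the stall counter, and the stand-in fitness functions are all assumptions used only to show the control flow.

```python
# Toy sketch of an incremental evolutionary scheme: evolve a module until
# fitness stops improving, freeze it, and grow a fresh module when the robot's
# body (and thus its task) changes. Frozen modules are never re-trained.
import random

def evolve_module(fitness, generations=200, genome_len=8):
    """Hill-climb a weight vector; stop early once improvement stalls."""
    best = [random.uniform(-1, 1) for _ in range(genome_len)]
    best_fit, stall = fitness(best), 0
    for _ in range(generations):
        child = [w + random.gauss(0, 0.1) for w in best]
        f = fitness(child)
        if f > best_fit:
            best, best_fit, stall = child, f, 0
        else:
            stall += 1
        if stall > 30:          # fitness has plateaued: freeze this module
            break
    return best, best_fit

frozen = []
for task in [lambda g: -sum(w * w for w in g),            # e.g. "walk"
             lambda g: -sum((w - 1) ** 2 for w in g)]:    # after a body change
    module, fit = evolve_module(task)
    frozen.append(module)       # new neurons handle the change; old ones stay fixed

print(len(frozen))  # → 2
```

The key property mirrored here is the one Warwick criticizes: capacity only ever grows, because old modules are frozen rather than reused.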

View Full Article | Return to Top

Wireless at WARP Speed

Rice University (01/29/09) Boyd, Jade

Ashutosh Sabharwal, director of Rice University's Center for Multimedia Communication (CMC), has developed a wireless open-access research platform (WARP) that enables scientists to conduct wireless research without having to build test platforms. The turnkey, open-source platform includes a collection of circuit boards and the transmitters and devices needed for high-end wireless communications. Sabharwal says that WARP's flexibility makes it particularly effective. When researchers need to test a variety of radio transmitters, wireless routers, and network access points, they can write a few software programs that allow WARP to serve as each of those devices. Sabharwal says Motorola is using WARP to test an entirely new, low-cost architecture for wireless Internet access in rural India, which he says is the type of research that would have too low a profit margin to be pursued without WARP. NASA is using WARP to find ways of reducing the weight, cost, and complexity in the wiring systems in spacecraft. CMC's Patrick Murphy is working with graduate students to use WARP in proof-of-concept technologies for "cognitive wireless," which is based on the fact that half of the nation's wireless spectrum is unused at any given time. Sabharwal says researchers are pursuing smart, "cognitive" networks that can change frequencies on the fly to open up unused spectrum for customer use.

View Full Article | Return to Top

Prepare for Truly Mobile Wi-Fi

Discover (01/28/09) Cass, Stephen

Microsoft Research's Vehicle Wi-Fi (Vi-Fi) project is working to enhance mobile Wi-Fi's effectiveness by modifying the software rules that determine how Wi-Fi works. Some wireless technologies are available that allow for seamless Internet connections when traveling at high speeds, but these technologies are either more expensive than Wi-Fi or transmit at slow speeds. Vi-Fi is based on the fact that in many places Wi-Fi is so ubiquitous that users are within range of multiple base stations at any given time. By connecting to multiple base stations simultaneously, users can request data through one base station and have it sent through another base station when they move out of range of the first station. The technology uses standard Wi-Fi hardware, which allows it to be easily and inexpensively deployed in urban areas or on highways. Microsoft is testing the technology on its campus by connecting two shuttle buses to the Internet through 11 base stations.
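The core Vi-Fi idea (multiple simultaneous associations so a reply can arrive through a different base station) can be illustrated with a toy model. The positions, ranges, and station names below are invented; real Vi-Fi operates at the protocol level, not with geometry like this.

```python
# Conceptual sketch: a moving client stays associated with every base station
# in range, so a reply requested through one station can be delivered by
# another after the client has moved on.
def stations_in_range(client_pos, stations, radius=100):
    return {name for name, pos in stations.items() if abs(pos - client_pos) <= radius}

stations = {"A": 0, "B": 80, "C": 160}   # base stations along a road (meters)

at_request = stations_in_range(50, stations)    # client requests data at x=50
at_reply = stations_in_range(130, stations)     # reply arrives once client is at x=130

# The transfer survives the move if some station covers both positions.
print(sorted(at_request & at_reply))  # → ['B']
```

With a single association (ordinary Wi-Fi), the connection through station A would simply have been lost.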

Full Article | Return to Top

P2P Networks Rife With Sensitive Health Care Data, Researcher Warns

Computerworld (01/30/09) Vijayan, Jaikumar

Sensitive medical data is easily available through peer-to-peer (P2P) file-sharing networks, reveals a study by researchers at Dartmouth College. During the study, the researchers used search terms related to the top 10 publicly traded U.S. healthcare organizations to see if they could find medical data on P2P networks such as Gnutella, FastTrack, Ares, and eDonkey. Dartmouth professor Eric Johnson says the searches yielded a plethora of information from healthcare companies, suppliers, and patients. For example, Johnson says he was able to find a 1,718-page document containing Social Security numbers, dates of birth, insurance information, treatment codes, and other sensitive data belonging to roughly 9,000 patients at a medical testing laboratory. Johnson and the other researchers were able to obtain the information because employees at healthcare providers had installed P2P clients on their computers, which allow users to download and share music and videos from shared folders but also can expose other types of files if care is not taken to control which folders are shared. Johnson says the study underscores the need for hospitals and other healthcare providers to be aware of the dangers of inadvertent data leakage, as well as the need to put improved controls in place to monitor, detect, and stop such leaks.
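One of the controls Johnson calls for is simply auditing what a shared folder would expose. A minimal sketch of such a check, using an SSN-like pattern as a rough heuristic (a real detector would look for many more identifiers and handle false positives):

```python
# Hedged illustration: scan text that a P2P client would expose from a shared
# folder for SSN-like patterns. The regex is a crude heuristic, not a
# production detector.
import re

SSN_LIKE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def count_ssn_like(text):
    """Count substrings shaped like U.S. Social Security numbers."""
    return len(SSN_LIKE.findall(text))

leaked = "Patient: J. Doe  DOB: 04/02/1961  SSN: 219-09-9999  Code: 83036"
print(count_ssn_like(leaked))  # → 1
```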

View Full Article | Return to Top

What a mess! Experts ponder space junk problem

By Veronika Oleksyn, Associated Press Writer

VIENNA – Think of it as a galactic garbage dump. With a recent satellite collision still fresh in mind, participants at a meeting in the Austrian capital this week are discussing ways to deal with space debris — junk that is clogging up the orbit around the Earth.

Some suggest a cosmic cleanup is the way to go. Others say time, energy and funds are better spent on minimizing the likelihood of future crashes by improving information sharing.

The informal discussions on the sidelines of a meeting of the United Nations Committee on the Peaceful Uses of Outer Space, which began Feb. 9 and ends Friday, arose from concern about the collision of a derelict Russian spacecraft and a working U.S. Iridium commercial satellite.

The Feb. 10 incident, which is still under investigation, generated space junk that could circle the Earth and threaten other satellites for the next 10,000 years; it added to the already worrying amount of debris surrounding the planet.

Nicholas L. Johnson, NASA's chief scientist for orbital debris, said about 19,000 objects are in low and high Earth orbit, including about 900 satellites; much of the rest is just plain junk.

He estimated that the 19,000 count includes about a thousand objects larger than 10 centimeters (4 inches) created by last week's satellite collision, in addition to many smaller ones. He predicted that as more junk accumulates, similar collisions — currently very rare — will become more likely by 2050.

To Johnson, the "true solution" in the long run is to go get the junk — or push it away to a higher altitude before it has time to crash into anything.

"Today's environment is all right but the environment is going to get worse, therefore I need to start thinking about the future and how can I clean up sometime in the future," he said.

View Full Article | Return to Top

Scientists claim big leap in nanoscale storage

Technology could fit contents of 250 DVDs on surface the size of a quarter

By Stephen Lawson

February 19, 2009 (IDG News Service) Nanotechnology researchers say they have achieved a breakthrough that could fit the contents of 250 DVDs on a coin-size surface and might also have implications for displays and solar cells.
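The headline figure implies a striking areal density. A rough back-of-the-envelope check — assuming a single-layer 4.7 GB DVD and a U.S. quarter of 24.26 mm diameter, neither of which appears in the article and either of which could differ from the researchers' own assumptions:

```python
import math

DVD_GB = 4.7                 # capacity of a single-layer DVD (assumed)
QUARTER_DIAMETER_MM = 24.26  # diameter of a U.S. quarter (assumed)

# Area of the quarter-sized surface, in cm^2.
area_cm2 = math.pi * (QUARTER_DIAMETER_MM / 2 / 10) ** 2

# Total data for 250 DVDs, and the implied areal density.
total_gb = 250 * DVD_GB
density_gb_per_cm2 = total_gb / area_cm2

print(f"{area_cm2:.2f} cm^2 holding {total_gb:.0f} GB "
      f"-> about {density_gb_per_cm2:.0f} GB per cm^2")
```

Under these assumptions the claim works out to well over a terabyte on roughly 4.6 square centimeters, which is why the researchers frame it as a potential transformation of the storage industry.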

The scientists — from the University of California, Berkeley, and the University of Massachusetts, Amherst — discovered a way to make certain kinds of molecules line up in perfect arrays over relatively large areas. The results of their work will appear today in the journal Science, according to a UC Berkeley statement. One of the researchers said the technology might be commercialized in less than 10 years, if industry is motivated.

More densely packed molecules could mean more data packed into a given space, higher-definition screens and more efficient photovoltaic cells, according to scientists Thomas Russell and Ting Xu. This could transform the microelectronics and storage industries, they said. Russell is director of the Materials Research Science and Engineering Center at UMass and a visiting professor at Berkeley, and Xu is a UC Berkeley assistant professor in chemistry and materials sciences and engineering.

Russell and Xu discovered a new way to create block copolymers, chemically dissimilar polymer chains that join together by themselves. Polymer chains can assemble into a precise, equidistant pattern, but research over the past 10 years has found that these patterns break up as scientists try to extend them over larger areas.

Russell and Xu used commercially available, man-made sapphire crystals to guide the polymer chains into precise patterns. Heating the crystals to between 1,300 and 1,500 degrees Celsius (2,372 to 2,732 degrees Fahrenheit) creates a pattern of sawtooth ridges that they used to guide the assembly of the block copolymers. With this technique, the only limit to the size of an array of block copolymers is the size of the sapphire, Xu said.

View Full Article | Return to Top

NIST to update guidelines for testing PIV card apps, middleware

By William Jackson, Feb 18, 2009

Draft revision of special publication reflects changes made to PIV specifications

The National Institute of Standards and Technology is revising guidelines for compliance testing of personal identity verification (PIV) applications and middleware to reflect changes in the specifications for PIV cards and access control systems.

NIST has released a draft of Special Publication 800-85A-1, “PIV Card Application and Middleware Interface Test Guidelines,” for public comment. The revisions include additional tests necessary to evaluate some of the optional features added to the PIV data model and card interface and PIV middleware as specified in SP 800-73-2, “Interfaces for Personal Identity Verification.”

View Full Article | Return to Top

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License