Gizmorama - December 30, 2013
Good Morning, When I read this - "Computers doing "approximate computing" for tasks not requiring perfect accuracy could double efficiency and reduce energy consumption..." - all I could think was that, as if things weren't bad enough, now our computers are getting lazy.
Learn about these interesting stories from the scientific community in today's issue.
Until Next Time,
Erin

*-- 'Approximate' computers could do tasks not requiring exact answers --*

WEST LAFAYETTE, Ind. - Computers doing "approximate computing" for tasks not requiring perfect accuracy could double efficiency and reduce energy consumption, U.S. scientists say. Such computers could run a growing number of applications designed to tolerate "noisy" real-world inputs and use statistical or probabilistic types of computations, computer engineers at Purdue University said.

"The need for approximate computing is driven by two factors: a fundamental shift in the nature of computing workloads, and the need for new sources of efficiency," said researcher Anand Raghunathan. "Computers were first designed to be precise calculators that solved problems where they were expected to produce an exact numerical value," he said. "However, the demand for computing today is driven by very different applications."

Current computers are designed to compute precise results even when it is not necessary, the researchers said, whereas approximate computing could endow computers with a capability similar to the human brain's ability to scale the degree of accuracy needed for a given task.

"If I asked you to divide 500 by 21 and I asked you whether the answer is greater than one, you would say yes right away," Raghunathan said. "You are doing division but not to the full accuracy. If I asked you whether it is greater than 30, you would probably take a little longer, but if I ask you if it's greater than 23, you might have to think even harder.

"The application context dictates different levels of effort, and humans are capable of this scalable approach, but computer software and hardware are not like that."

Purdue researchers said they're working on a range of hardware techniques to demonstrate approximate computing, showing a potential for improvements in energy efficiency.
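To make the idea concrete, here is a small illustrative sketch (my example, not one from the Purdue work) of one common approximate-computing technique, loop perforation: deliberately processing only a fraction of the input to save work, which is acceptable when the data is "noisy" and only a statistically good answer is needed. The function names and data are hypothetical.

```python
# Loop perforation: trade a little accuracy for roughly proportional savings
# in work. This is a software illustration of the approximate-computing idea;
# the Purdue research itself concerns hardware techniques.

def mean_exact(xs):
    """Compute the mean over every element (full effort)."""
    return sum(xs) / len(xs)

def mean_perforated(xs, stride=2):
    """Compute the mean over every stride-th element (1/stride of the work)."""
    sampled = xs[::stride]  # skip (stride - 1) of every stride samples
    return sum(sampled) / len(sampled)

data = [0.1 * i for i in range(1000)]
print(mean_exact(data))       # ≈ 49.95, touching all 1000 samples
print(mean_perforated(data))  # ≈ 49.9, touching only 500 samples
```

For smooth or noisy inputs like sensor streams, the perforated result stays close to the exact one while halving the arithmetic, which is the kind of accuracy-for-efficiency trade the article describes.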
*-- Fossil is earliest evidence of human hand evolution for tool use --*

FAYETTEVILLE, Ark. - A hand bone from an early human ancestor found in East Africa shows the earliest evidence of a structural feature related to tool use, anthropologists say. Dated to 1.42 million years old, the bone from the early hominin Homo erectus suggests a distinctive feature of modern hands evolved more than a half million years earlier than previously thought, scientists at the University of Arkansas, Fayetteville, said Monday.

"Modern human hands are specialized to hold tools, but hand bones are difficult to find, and we haven't known when modern human hands developed," anthropology Professor J. Michael Plavcan said. "With this discovery, we have the earliest evidence of the structural changes of the hand that are associated with tool use."

The third metacarpal bone, discovered in Kenya, displayed a styloid process, a curved projection at the end of the bone important to a hand that uses tools with both dexterity and precision, the researcher said. While stone tools date back at least 2.58 million years, until this discovery the earliest evidence of structural characteristics related to tool use dated back just 800,000 years, they said.

"There's still a huge gap in our understanding of the evolution of the hand," Plavcan said. "We need to find even earlier bones to determine just when structural features of the hand appeared."
***Missed an Issue? Visit the Gizmorama Archives