The scientists say it is for people with diseases, but of course the gamers will eventually be the biggest users. Then again, if virtual game playing is an addiction, the scientists are right anyhow.
PHILADELPHIA - A team of researchers co-led by the University of Pennsylvania has developed and tested a new high-resolution, ultra-thin device capable of recording brain activity from the cortical surface without having to use penetrating electrodes. The device could make possible a whole new generation of brain-computer interfaces for treating neurological and psychiatric illness and research. The work was published in Nature Neuroscience.
When will the first person get their skull sawed open to implant an interface to play video games faster? Which country will they have to travel to in order to get this surgery done? Seriously. Also, will humans ever drive cars with implanted interfaces? Or will cars be shifted to pure robotic operation before that happens?
There is of course scientific value to a higher resolution way to measure brain activity.
"The new technology we have created can conform to the brain's unique geometry, and records and maps activity at resolutions that have not been possible before," says Brian Litt, MD, the study's senior author and Associate Professor of Neurology at the Perelman School of Medicine and Bioengineering at the University of Pennsylvania. "Using this device, we can explore the brain networks underlying normal function and disease with much more precision, and its likely to change our understanding of memory, vision, hearing and many other normal functions and diseases." For our patients, implantable brain devices could be inserted in less invasive operations and, by mapping circuits involved in epilepsy, paralysis, depression and other 'network brain disorders' in sufficient detail, this could allow us to intervene to make patients better, Litt said.
Want a brain implant that distracts you from negative thoughts? No, I'm not talking about emotional depression. I'm talking about the sorts of resentful negative thoughts people develop when the financial system nearly crashes. An implant could help you become comfortably numb.
A 360 channel array. No doubt future models will come with even higher capacity.
Composed of 720 silicon nanomembrane transistors in a multiplexed 360-channel array, the newly designed ultrathin, flexible, foldable device can be positioned not only on the brain surface but also inside sulci and fissures or even between the cortical hemispheres, areas that are physically inaccessible to conventional rigid electrode arrays. Current arrays also require separate wires for each individual sensor, meaning that they can sample broad regions of the brain with low resolution or small regions with high resolution, but not both. The multiplexed nanosensors of the new device can cover a much larger brain area with high resolution, while using almost ten times fewer wires.
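The wire savings from multiplexing can be sketched with simple arithmetic: a conventional array runs one dedicated wire per electrode, while a row/column-multiplexed grid needs only one wire per row plus one per column. The 20x18 grid below is a plausible layout I've assumed for a 360-channel array, not the paper's actual geometry.

```python
# Rough model of why multiplexing cuts wire count (illustrative figures,
# not from the paper): a conventional array needs one wire per electrode,
# while a row/column-multiplexed grid needs only rows + columns.

def wires_conventional(channels):
    return channels  # one dedicated wire per sensor

def wires_multiplexed(rows, cols):
    return rows + cols  # shared row-select and column-readout lines

channels = 360
rows, cols = 20, 18          # one plausible 360-channel grid layout
assert rows * cols == channels

conv = wires_conventional(channels)   # 360
mux = wires_multiplexed(rows, cols)   # 38
print(f"{conv} wires -> {mux} wires ({conv / mux:.1f}x fewer)")
```

A 20x18 layout works out to about 9.5x fewer wires, consistent with the "almost ten times fewer" claim.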
Scientists try to surpass the human eye with a nickel-sized camera that includes zoom. Zoom would be incredibly handy.
Researchers from Northwestern University and the University of Illinois at Urbana-Champaign are the first to develop a curvilinear camera, much like the human eye, with the significant feature of a zoom capability, unlike the human eye.
The "eyeball camera" has a 3.5x optical zoom, takes sharp images, is inexpensive to make and is only the size of a nickel. (A higher zoom is possible with the technology.)
While the camera won't be appearing at Best Buy any time soon, the tunable camera -- once optimized -- should be useful in many applications, including night-vision surveillance, robotic vision, endoscopic imaging and consumer electronics.
If you could get a bionic eye that was better than your natural eye would you pop for it? Imagine the ability to shift into infrared or ultraviolet as needed. A bionic eye would no doubt include the ability to take and store pictures too, all under thought control.
Beyond the human eye.
"We were inspired by the human eye, but we wanted to go beyond the human eye," said Yonggang Huang, Joseph Cummings Professor of Civil and Environmental Engineering and Mechanical Engineering at Northwestern's McCormick School of Engineering and Applied Science. "Our goal was to develop something simple that can zoom and capture good images, and we've achieved that."
With a bionic eye properly networked you could also watch movies by just thinking yourself into movie-watching mode.
In an interview with The Atlantic the CEO of Google reveals he does not want to implant a Google interface into people's brains. Luddite.
The end of the interview turned to the future of technology. When Bennet asked about the possibility of a Google "implant," Schmidt invoked what the company calls the "creepy line."
"Google policy is to get right up to the creepy line and not cross it," he said. Google implants, he added, probably crosses that line.
This brings back memories of the 1967 classic The President's Analyst starring James Coburn where the "Phone Company" kidnaps him to try to convince him to support the implantation of a mini-phone into the brain of all American citizens. Here is a clip from the movie about the Cerebrum Communicator. I can't understand his character's skepticism. Today such a device would include web search capability integrated with blog post capability. I could write my blog with my eyes closed.
According to Kohno and his colleagues, who published their concerns July 1 in Neurosurgical Focus, most current devices carry few security risks. But as neural engineering becomes more complex and more widespread, the potential for security breaches will mushroom.
For example, the next generation of implantable devices to control prosthetic limbs will likely include wireless controls that allow physicians to remotely adjust settings on the machine. If neural engineers don’t build in security features such as encryption and access control, an attacker could hijack the device and take over the robotic limb.
The Manchurian Candidate could be remote controlled.
Imagine implants that interface the brain to artificial eyes. Hack into them and you could potentially create images of things that are not really there.
So far one's brain is one's private preserve. This won't always be the case.
A wearable patch that you replace once a week monitors heart rate, respiration, body temperature, and other indicators to calculate your calorie consumption and burning rates. Then a cell phone or PC can get the information via Bluetooth and advise you about whether you need to eat less or exercise more.
The calorie monitor, which is being developed by biotech incubator PhiloMetron, uses a combination of sensors, electrodes, and accelerometers that--together with a unique algorithm--measure the number of calories eaten, the number of calories burned, and the net gain or loss over a 24-hour period. The patch sends this data via a Bluetooth wireless connection to a dieter's cell phone, where an application tracks the totals and provides support. "You missed your goal for today, but you can make it up tomorrow by taking a 15-minute walk or having a salad for dinner," it might suggest.
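A minimal sketch of the kind of daily accounting and coaching the article describes. PhiloMetron's actual algorithm is proprietary; the function names, the -250 kcal daily goal, and the 100 kcal-per-walk rule of thumb below are all invented for illustration.

```python
# Hypothetical sketch of the patch's daily calorie accounting.

def daily_net(calories_eaten, calories_burned):
    """Net calorie gain (+) or loss (-) over a 24-hour period."""
    return calories_eaten - calories_burned

def advice(net, goal_net=-250):
    """Turn the daily net into the kind of suggestion the phone app gives."""
    if net <= goal_net:
        return "On track."
    deficit_needed = net - goal_net
    # ~100 kcal per 15-minute brisk walk is a common rule of thumb
    walks = deficit_needed / 100
    return (f"Over goal by {deficit_needed} kcal; about {walks:.0f} "
            f"x 15-min walks would make it up.")

net = daily_net(2300, 2200)   # +100 kcal for the day
print(advice(net))
```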
I have no idea how well this generation of device works. But it is a step in an inevitable direction. We will wear external sensors as patches and as sensor nets built into clothing, jewelry, and watches. Diabetics and heart patients can benefit from real time warnings. Athletes can get warnings of overheating and dehydration.
The monitoring systems will eventually get integrated with embedded drug releasing systems that will act much like endocrine organs adjusting our metabolism when it gets out of desirable operating ranges.
Traditionally, stimulating nerves or brain tissue involves cumbersome wiring and a sharp metal electrode. But a team of researchers at Case Western Reserve University is going "wireless."
And it's a unique collaboration between chemists and neuroscientists that led to the discovery of a remarkable new way to use light to activate brain circuits with nanoparticles.
Ben Strowbridge, an associate professor in the neurosciences department in the Case Western Reserve School of Medicine and Clemens Burda, an associate professor in chemistry, say it's rare in science that people from very different fields get together and do something that is both useful and that no one had thought of before. But that is exactly what they've done.
But hey, it uses photovoltaic nanoparticles. At least we'd become environmentally sustainable robots.
By using semiconductor nanoparticles as tiny solar cells, the scientists can excite neurons in single cells or groups of cells with infrared light. This eliminates the need for the complex wiring by embedding the light-activated nanoparticles directly into the tissue. This method allows for a more controlled reaction and closely replicates the sophisticated focal patterns created by natural stimuli.
The electrodes used in previous nerve stimulations don't accurately recreate spatial patterns created by the stimuli and also have potential damaging side effects.
Nanoparticles embedded in tissue would be hard to detect. So a secret agent could get turned into an enemy by some complex layout of embedded nanoparticles.
Their goal is to use it to get around nerve damage. Imagine wireless communication to an embedded device that then shines infrared light on neurons to activate them. One could control nerves in extremities cut off by spinal damage. Or transmit sensory data from extremities to the brain.
"The long-term goal of this work is to develop a light-activated brain-machine interface that restores function following nerve or brain impairments," Strowbridge says. "The first attempts to interface computers with brain circuitry are being done now with complex metal electrode stimulation arrays that are not well suited to recreating normal brain activity patterns and also can cause significant damage."
Powerful neuro-tools for medicine become powerful neuro-tools for other purposes as well.
I can also imagine reasons why a person would want to hand over control of part of their nervous system to an external force. Someone on a diet could program their house computer to disable them from opening the refrigerator or food cabinets between meals. Any time you tried the house computer could flash you with a pattern that rendered your arm immobile. Or, hey, exercise without having to think about it. Get a computer to exercise your body while you watch a movie.
Researchers in a study funded by the National Institutes of Health (NIH) have demonstrated for the first time that a direct artificial connection from the brain to muscles can restore voluntary movement in monkeys whose arms have been temporarily anesthetized. The results may have promising implications for the quarter of a million Americans affected by spinal cord injuries and thousands of others with paralyzing neurological diseases, although clinical applications are years away.
This isn't just useful for people who have spinal cord injuries. Direct neural interfaces that can control one's own muscles could also control heavy equipment such as airplanes. Also, the ability to control one's own muscles via an artificial route that bypasses spinal nerve pathways could allow one to control one's extremities much more quickly. The wave of depolarization that transmits a pulse down a nerve's membranes is pretty slow compared to the speed of electrons in a wire.
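The speed gap is easy to put numbers on. Using textbook ballpark figures (not measurements from the study), a fast myelinated motor fiber conducts at roughly 100 m/s, while a signal in a cable propagates at a large fraction of the speed of light:

```python
# Back-of-envelope latency comparison, brain to hand.

NERVE_SPEED = 100.0        # m/s, fast myelinated motor fiber (ballpark)
WIRE_SPEED = 2e8           # m/s, signal propagation in a cable (~0.67c)
path = 1.0                 # meters, roughly brain to hand

nerve_ms = path / NERVE_SPEED * 1e3   # 10 ms
wire_ms = path / WIRE_SPEED * 1e3     # ~5 nanoseconds
print(f"nerve: {nerve_ms:.0f} ms, wire: {wire_ms:.6f} ms")
```

Ten milliseconds versus a few nanoseconds: a millionfold difference in raw propagation delay.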
"This study demonstrates a novel approach to restoring movement through neuroprosthetic devices, one that would link a person's brain to the activation of individual muscles in a paralyzed limb to produce natural control and movements," said Joseph Pancrazio, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke (NINDS).
The research was conducted by Eberhard E. Fetz, Ph.D., professor of physiology and biophysics at the University of Washington in Seattle and an NINDS Javits awardee; Chet T. Moritz, Ph.D., a post-doctoral fellow funded by NINDS; and Steve I. Perlmutter, Ph.D., research associate professor. The results appear in the online Oct. 15 issue of Nature. The study was performed at the Washington National Primate Research Center, which is funded by NIH's National Center for Research Resources.
In the study, the researchers trained monkeys to control the activity of single nerve cells in the motor cortex, an area of the brain that controls voluntary movements. Neuronal activity was detected using a type of brain-computer interface. In this case, electrodes implanted in the motor cortex were connected via external circuitry to a computer. The neural activity led to movements of a cursor, as monkeys played a target practice game.
Of course, if electrodes get implanted into a person's muscles this creates the possibility of remote control of a person's muscles. A guy could get kidnapped and given secret surgery to implant a radio receiver and wiring to some of his peripheral muscles. The first time he finds out about what his kidnappers did is when he grabs a gun from a security agent and finds himself powerless to stop from shooting a top political leader. Other possibilities come to mind with husbands who get tired of hearing their wives gossip.
After each monkey mastered control of the cursor, the researchers temporarily paralyzed the monkey's wrist muscles using a local anesthetic to block nerve conduction. Next, the researchers converted the activity in the monkey's brain to electrical stimulation delivered to the paralyzed wrist muscles. The monkeys continued to play the target practice game—only now cursor movements were driven by actual wrist movements—demonstrating that they had regained the ability to control the otherwise paralyzed wrist.
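The conversion step can be caricatured in a few lines: firing of the monitored cell above some baseline drives proportional stimulation of the wrist muscle. The baseline, gain, and current cap below are invented for illustration; the study's actual transform is not described in the article.

```python
# Minimal sketch of converting motor-cortex activity into muscle
# stimulation. All thresholds and gains are assumed, not from the study.

def firing_rate_to_stim(rate_hz, baseline_hz=10.0, gain=0.5, max_ma=10.0):
    """Map a cell's firing rate (Hz) to stimulation current (mA)."""
    excess = max(0.0, rate_hz - baseline_hz)   # only activity above baseline
    return min(max_ma, gain * excess)          # capped for safety

for rate in (5, 10, 20, 40):
    print(rate, "Hz ->", firing_rate_to_stim(rate), "mA")
```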
Picture mini-electrodes in your brain tied to a transmitter. You could send messages to many devices in your environment including a garage door, a car ignition, or the thermostat in a house. I expect human-machine interfaces will become far more powerful in the future.
The German company Rodenstock has developed what I've always wanted: sunglasses that show you some of your vital signs. But the sunglasses don't yet report your bionic legs moving you at 60 miles an hour.
Sunglasses that can show athletes' performance and heart rate data in their peripheral vision have been developed by a German company.
The sunglasses – dubbed "Informance" – display a stopwatch and heart rate at one edge. The extra components needed to do this add just 7 grams to the glasses' overall weight – which is much less than previous head-up displays.
This is just the start of a trend toward far greater data collection and display power in glasses. Imagine embedded sensors that would report blood chemistry vitals to the glasses by very low power radio signals. Watches will also collect information about blood chemistry and will have sensors that measure chemicals in your sweat.
Life sometimes imitates cartoons and TV shows. Shades of the Bionic Woman and the Six Million Dollar Man:
Goldfarb decided on the miniaturized rocket propellant approach because batteries can't provide enough power to make strong prosthetic arms.
Combine a mechanical arm with a miniature rocket motor: The result is a prosthetic device that is the closest thing yet to a bionic arm.
A prototype of this radical design has been successfully developed and tested by a team of mechanical engineers at Vanderbilt University as part of a $30 million federal program to develop advanced prosthetic devices.
“Our design does not have superhuman strength or capability, but it is closer in terms of function and power to a human arm than any previous prosthetic device that is self-powered and weighs about the same as a natural arm,” says Michael Goldfarb, the professor of mechanical engineering who is leading the effort.
The prototype can lift (curl) about 20 to 25 pounds – three to four times more than current commercial arms – and can do so three to four times faster. “That means it has about 10 times as much power as other arms despite the fact that the design hasn’t been optimized yet for strength or power,” he says.
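The "about 10 times as much power" figure follows directly from power being force times speed:

```python
# Power scales as force x speed, so 3-4x the lift at 3-4x the speed
# gives roughly an order of magnitude more power.
low = 3 * 3      # low end: 3x force, 3x speed
high = 4 * 4     # high end: 4x force, 4x speed
print(f"power ratio between {low}x and {high}x")   # brackets the ~10x claim
```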
At a certain point, the weight of the batteries required to provide the energy to operate the arm for a reasonable period becomes a problem. It was the poor power-to-weight ratio of the batteries that drove Goldfarb to look for alternatives in 2000 while he was working on a previous exoskeleton project for DARPA. He decided to miniaturize the monopropellant rocket motor system that is used by the space shuttle for maneuvering in orbit. His adaptation impressed the Johns Hopkins researchers, so they offered him $2.7 million in research funding to apply this approach to the development of a prosthetic arm.
Goldfarb’s power source is about the size of a pencil and contains a special catalyst that causes hydrogen peroxide to burn. When hydrogen peroxide burns, it produces pure steam. The steam is used to open and close a series of valves. The valves are connected to the spring-loaded joints by belts made of a special monofilament used in appliance handles and aircraft parts. A small sealed canister of hydrogen peroxide that easily fits in the upper arm can provide enough energy to power the device for 18 hours of normal activity.
I'm expecting nanotubes designed to work more like muscles will eventually displace rockets and electric motors from arms. Though a rocket-powered arm could some day deliver a rocket punch, which would make a rocket-propelled fist and forearm pretty attractive to some.
A new study from Imperial College London shows that robot assisted knee surgery is significantly more accurate than conventional surgery.
The team of surgeons tested whether Acrobot, a robotic assistant, could improve surgical outcomes for patients undergoing partial knee replacement. Acrobot works by helping the surgeon to line up the replacement knee parts with the existing bones.
The surgeons looked at 27 patients undergoing unicompartmental knee replacement. The patients were separated into two groups as part of a randomised controlled trial, with 14 having conventional surgery, and the remaining 13 having robot assisted surgery.
Although the operations took a few minutes longer using the robotic assistant, the replacement knee parts were more accurately lined up than in conventional surgery. All of the robotically assisted operations lined up the bones to within two degrees of the planned position, but only 40 percent of the conventionally performed cases achieved this level of accuracy.
The team found there were no additional side effects from using robot assisted surgery, and recovery from surgery was quicker in most cases.
The quicker recovery is a reflection of more accurate work. Since robots can improve upon what humans can do, robotics could greatly improve surgical outcomes by reducing error rates and reducing collateral damage to other tissue.
Yes, these robots are not designed to take over the operation entirely. But the big cost savings will come from totally robotic surgery.
Professor Justin Cobb, from Imperial College London, who led the research team, said: "These robots are designed to hold the surgeon's hand in the operating theatre, not take over the operation. This study shows they can be an enormous help, preventing surgeons from making mistakes. More importantly, by showing how the increased accuracy makes a difference to how well a knee works after surgery, we will be able to develop a new generation of less invasive procedures without the risks of error, providing faster recovery and better functional outcomes for patients."
The study involved both surgeons and engineers from Imperial College, with medical robotics engineers designing the Acrobot prototype, and surgeons testing it.
Professor Cobb added: "This study could have important implications for not just surgery, but also for health economics. By improving the accuracy of surgery, and ultimately improving the outcome for patients, we can make sure the knee replacements work better and last longer, preventing the need for additional surgery."
The study was funded by The Acrobot Co. Ltd., a spin-out of Imperial College London.
The big cost savings would come from speeding up surgery and reducing human involvement.
Some day genetically engineered pigs will produce organs that will be robotically removed from pigs and robotically transplanted into humans. Also, automated systems will grow up and transfer stem cells into people as part of rejuvenation therapies.
I would prefer development of better medical therapies such as gene therapies and cell therapies that reduce the need for surgery in the first place. But surgery will be necessary for a long time to come and automation of surgery could save trillions of dollars.
Some people are getting Radio Frequency Identification Devices (RFIDs) implanted into themselves so that they never have to worry about forgetting their car key or house key.
Cyborgs have stepped out of science fiction and into real life with a small but growing group of tech aficionados who are getting tiny computer chips implanted into their bodies to do everything from opening doors to unlocking computer programs.
Amal Graafstra and his girlfriend Jennifer Tomblin never have to worry about forgetting the keys to her Vancouver home or locking themselves out of Graafstra's Volkswagen GT.
They can simply walk up to the door and, with a wave of a hand, the lock will open. Ditto for the computer. No more struggling to remember complicated passwords and no more lost keys.
Click through on the article to see a picture of a hand with the chip implant pushed near the surface of the skin. Same picture here.
Graafstra divides his time between Bellingham Washington and Vancouver BC. But he got the RFID implant done in Los Angeles.
The implantable tags cost only a couple of dollars. But the surgeon's fee was probably in the hundreds of dollars. Still, you could implant it yourself (perhaps team up with a friend and do the implants on each other) if you were willing to research what tools are needed, then buy and sterilize them. Graafstra has a forthcoming book RFID Toys that'll teach you some of what you need to know to take your first step toward cyborgism.
A small but substantial segment of the populace will probably go for implantable RFID once a critical mass of doctors to install the chips and technicians to upgrade houses and cars becomes available. Someone with a more technical bent can get into it now as Graafstra did. But you have to get RFID readers installed in your house door and car door, in both cases with electric power attached to do the unlocking.
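The door-side logic is the easy part. Here is a hedged sketch of what a reader installation like Graafstra's might run; the tag ID and function names are invented, and in practice the check would be wired to whatever serial output the actual RFID reader exposes.

```python
# Hypothetical door-unlock logic for an implanted RFID tag.
# The tag ID below is made up; real installs use the reader's serial output.

AUTHORIZED = {"0x0041C2A7"}   # implanted tag IDs allowed to open this door

def on_tag_scanned(tag_id, unlock):
    """Called when the reader sees a tag; fires unlock() if authorized."""
    if tag_id in AUTHORIZED:
        unlock()
        return True
    return False

opened = []
on_tag_scanned("0x0041C2A7", lambda: opened.append("door"))
print(opened)   # the authorized tag opened the door
```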
But this idea seems a bit problematic. When you are leaving the house do you just keep one hand away from the door to prevent yourself from unlocking it? Also, do you have to worry when driving that your hand might unlock the door if you go to buckle your seat belt in a dangerous neighborhood?
The biological fuel cell uses glucose, a sugar in blood, with a non-toxic substance used to draw electrons from glucose, said the team led by Matsuhiko Nishizawa, bio-engineering professor at the graduate school of state-run Tohoku University.
The fuel cell has the size of a small coin and can generate 0.2 milliwatts of electric power.
While 0.2 milliwatts is not much power it is enough to operate, say, a blood glucose sensor or other very small implanted sensor. Given the expected future availability of implantable sensors based on nanotechnology a small power source to operate them would be valuable.
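Simple duty-cycle arithmetic shows what 0.2 milliwatts buys. The sensor's active and sleep power draws below are assumed figures, not from the article:

```python
# What 0.2 mW buys you: duty-cycle budget for an implanted sensor.

BUDGET_MW = 0.2      # continuous output of the glucose fuel cell
ACTIVE_MW = 2.0      # assumed draw of a sensor while sampling
SLEEP_MW = 0.005     # assumed sleep-mode draw

# Largest duty cycle d satisfying d*ACTIVE + (1-d)*SLEEP <= BUDGET
duty = (BUDGET_MW - SLEEP_MW) / (ACTIVE_MW - SLEEP_MW)
print(f"sensor can be active {duty:.1%} of the time")
```

Under those assumptions the cell sustains a sensor that samples roughly a tenth of the time and sleeps otherwise, which is plenty for periodic glucose readings.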
"We set out to create an exoskeleton that combines a human control system with robotic muscle," said Homayoon Kazerooni, professor of mechanical engineering and director of UC Berkeley's Robotics and Human Engineering Laboratory. "We've designed this system to be ergonomic, highly maneuverable and technically robust so the wearer can walk, squat, bend and swing from side to side without noticeable reductions in agility. The human pilot can also step over and under obstructions while carrying equipment and supplies."
The Berkeley Lower Extremity Exoskeleton (BLEEX), as it's officially called, consists of mechanical metal leg braces that are connected rigidly to the user at the feet, and, in order to prevent abrasion, more compliantly elsewhere. The device includes a power unit and a backpack-like frame used to carry a large load.
Such a machine could become an invaluable tool for anyone who needs to travel long distances by foot with a heavy load. The exoskeleton could eventually be used by army medics to carry injured soldiers off a battlefield, firefighters to haul their gear up dozens of flights of stairs to put out a high-rise blaze, or rescue workers to bring in food and first-aid supplies to areas where vehicles cannot enter.
"The fundamental technology developed here can also be developed to help people with limited muscle ability to walk optimally," said Kazerooni.
The researchers point out that the human pilot does not need a joystick, button or special keyboard to "drive" the device. Rather, the machine is designed so that the pilot becomes an integral part of the exoskeleton, thus requiring no special training to use it. In the UC Berkeley experiments, the human pilot moved about a room wearing the 100-pound exoskeleton and a 70-pound backpack while feeling as if he were lugging a mere 5 pounds.
The project, funded by the Defense Advanced Research Projects Agency, or DARPA, began in earnest in 2000. Next week, from March 9 through 11, Kazerooni and his research team will showcase their project at the DARPA Technical Symposium in Anaheim, Calif.
For the current model, the user steps into a pair of modified Army boots that are then attached to the exoskeleton. A pair of metal legs frames the outside of a person's legs to facilitate ease of movement. The wearer then dons the exoskeleton's vest that is attached to the backpack frame and engine. If the machine runs out of fuel, the exoskeleton legs can be easily removed so that the device converts to a large backpack.
More than 40 sensors and hydraulic actuators form a local area network (LAN) for the exoskeleton and function much like a human nervous system. The sensors, including some that are embedded within the shoe pads, are constantly providing the central computer brain information so that it can adjust the load based upon what the human is doing. When it is turned on, the exoskeleton is constantly calculating what it needs to do to distribute the weight so little to no load is imposed on the wearer.
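A toy version of that load-shedding calculation, using the article's weight figures but a control law that is a drastic simplification of what BLEEX actually does:

```python
# Toy load-shedding loop: hydraulics carry whatever the sensed load
# demands beyond a small residual left on the wearer. Weights are from
# the article; the control law is a gross simplification.

EXO_LB = 100          # exoskeleton weight
PACK_LB = 70          # backpack load
FELT_LB = 5           # what the pilot should perceive

def actuator_force(sensed_load_lb, target_felt_lb=FELT_LB):
    """Force the hydraulics must supply so the wearer feels only the target."""
    return max(0.0, sensed_load_lb - target_felt_lb)

total = EXO_LB + PACK_LB
print(f"hydraulics carry {actuator_force(total):.0f} of {total} lb")
```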
One significant challenge for the researchers was to design a fuel-based power source and actuation system that would provide the energy needed for a long mission. The UC Berkeley researchers are using an engine that delivers hydraulic power for locomotion and electrical power for the computer. The engine provides the requisite energy needed to power the exoskeleton while affording the ease of refueling in the field.
The current prototype allows a person to travel over flat terrain and slopes, but work on the exoskeleton is ongoing, with the focus turning to miniaturization of its components. The UC Berkeley engineers are also developing a quieter, more powerful engine, and a faster, more intelligent controller, that will enable the exoskeleton to carry loads up to 120 pounds within the next six months. In addition, the researchers are studying what it takes to enable pilots to run and jump with the exoskeleton legs.
Check out a 1 megabyte jpeg picture of a human wearing a BLEEX exoskeleton and see more images at the Berkeley Lower Extremity Exoskeleton (BLEEX) project page.
The BLEEX brings to mind the powerloader exoskeleton that Sigourney Weaver as Ripley wore to battle the alien queen in the climactic fight scene in the 1986 movie Aliens. As images here, here, and here demonstrate, its larger size and huge grappler hands made it look far more industrial and powerful than BLEEX. But a real life implementation of Ripley's industrial power loader would weigh too much and give its users too large a profile to be useful in most battlefield applications.
Researchers in the Sandia National Laboratories Advanced Concepts Group are using computers hooked up to a variety of medical monitoring devices to measure brain and body changes of people in meetings and feed this information back to meeting members in order to get them to change the way they are responding.
Aided by tiny sensors and transmitters called a PAL (Personal Assistance Link) your machine (with your permission) will become an anthroscope - an investigator of your up-to-the-moment vital signs, says Sandia project manager Peter Merkle. It will monitor your perspiration and heartbeat, read your facial expressions and head motions, analyze your voice tones, and correlate these to keep you informed with a running account of how you are feeling - something you may be ignoring - instead of waiting passively for your factual questions. It also will transmit this information to others in your group so that everyone can work together more effectively.
"We're observing humans by using a lot of bandwidth across a broad spectrum of human activity," says Merkle, who uses a Tom Clancy-based computer game played jointly by four to six participants to develop a baseline understanding of human response under stress.
"If someone's really excited during the game and that's correlated with poor performance, the machine might tell him to slow down via a pop-up message," says Merkle. "On the other hand, it might tell the team leader, 'Take Bill out of loop, we don't want him monitoring the space shuttle today. He's had too much coffee and too little sleep. Sally, though, is giving off the right signals to do a great job.'"
The idea of the devices has occasioned some merry feedback, as from a corporate executive who emailed, "Where do we get the version that tells people they are boring in meetings? Please hurry and send that system to us. A truck full or two should cover us."
More seriously, preliminary results on five people interacting in 12 sessions beginning Aug. 18 indicate that personal sensor readings caused lower arousal states, improved teamwork and better leadership in longer collaborations. A lowered arousal state - the amount of energy put into being aware - is preferable in dealing competently with continuing threat.
"Some people think you have to start with a theory. Darwin didn't go with a theory. He went where his subjects were and started taking notes. Same here," he says. Merkle presented a paper on his group's work at the NASA Human Performance conference Oct. 28-29 in Houston. "Before we knew that deep-ocean hydrothermal vents existed, we had complex theories about what governed the chemistry of the oceans. They were wrong." Now it's state-of-the-art to use EEG systems to link up brain events to social interactions, he says. "Let's get the data and find out what's real."
The tools for such a project - accelerometers to measure motion, face-recognition software, EMGs to measure muscle activity, EKGs to measure heart beat, blood volume pulse oximetry to measure oxygen saturation, a Pneumotrace(tm) respiration monitor to measure breathing depth and rapidity - are all off-the-shelf items.
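One invented way those off-the-shelf signals could be fused into a single arousal score; Sandia's actual correlation method is not described in the article, so every normalization range and the simple averaging here are my assumptions.

```python
# Invented fusion of heart rate, respiration, and muscle-activity readings
# into one 0-to-1 arousal score. Resting/peak ranges are assumed.

def normalize(value, rest, peak):
    """Scale a reading to 0..1 between resting and peak levels."""
    return min(1.0, max(0.0, (value - rest) / (peak - rest)))

def arousal(heart_bpm, breaths_per_min, emg_uv):
    signals = [
        normalize(heart_bpm, 60, 180),       # EKG heart rate
        normalize(breaths_per_min, 12, 40),  # respiration monitor
        normalize(emg_uv, 5, 100),           # EMG muscle activity
    ]
    return sum(signals) / len(signals)       # naive equal-weight average

score = arousal(heart_bpm=110, breaths_per_min=22, emg_uv=30)
print(f"arousal: {score:.2f}")   # higher = more keyed up
```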
Further work is anticipated in joint projects between Sandia and the University of New Mexico, and also with Caltech.
"In 2004 we intend to integrate simultaneous four-person 128-channel EEG recording," says Merkle, "correlating brain events, physiologic dynamics, and social phenomena to develop assistive methods to improve group and individual performance."
How many potential abuses of this technology can you imagine? One of the worst I can think of would be bosses using it to ensure that everyone is paying attention in a seminar introducing the latest management fad.
On the more optimistic side, if the device could measure confusion then it would help alert someone that his explanation of some problem or proposal is not getting across. Also, each person could use it to detect their own anger and frustration and work to reduce the stress they feel in some situations. This would have the beneficial effect of slowing down the rate of aging. The ability to do biofeedback training while in silly meetings would also have the benefit of making those meetings more productive - at least for one's own personal quest to develop thought patterns that allow one to remain unstressed by the folly found in so many corporate settings.
A “Batcane,” developed by Sound Foresight Ltd. and Cambridge Consultants Ltd., directly mimics bats' echolocation by emitting ultrasonic pulses of sound (beyond the reach of human hearing) and analyzing the echoes that bounce back from nearby objects.
The cane navigates by bouncing ultrasonic signals off objects that lie in its path and feeding the information back to the user. This makes it possible to avoid obstacles with confidence - even obstacles at head height. No other primary aid can do this effectively. The Batcane is now being developed for manufacture and will be launched at the start of 2004.
It also picks up the reflections of these waves to map obstacles up to three metres away in three dimensions. Buttons on the cane's handle vibrate gently to warn a user to dodge low ceilings and sidestep objects blocking their path.
But will this remain a technology only for blind people? Think about it. Bats use sonar. Their sonar transmitters and receivers must be very small because, well, bats are very small. Wouldn't it be handy to have built-in sonar, perhaps located behind your ears and under your chin? If it could be made to blend in it might not affect your appearance. You'd get a warning if you were about to, say, bump your head or trip over a chair in the dark. Sonar could warn you if you are about to walk off a steep cliff or become entangled in brush.
Then there is the comic book and movie superhero angle. Batman should certainly have at least a Batcane for sneaking up on bad guys at night. His Batcar should have sonar as well. Infrared isn't adequate for cold objects. But if he's going to be a superhero then he ought to have genetic engineering to give him supersenses befitting his exalted role.
A 13-year-old girl in Wimbledon, England named Kat Reid has become the first person to receive an artificial bone replacement that can be extended in length without surgery as she grows.
In November, orthopaedic surgeons Steve Cannon and Tim Briggs fitted the world's first bionic bone into the thigh of 13-year-old Kat Reid.
The extendable prosthesis, to give it its proper name, is a major breakthrough because it allows Kat's left leg to grow in pace with her right, without her having to undergo further surgery.
Application of an external magnetic field causes the implanted prosthetic part to grow by 1 millimeter at a time.
While the article does not say why Kat Reid needed a prosthetic replacement for her thigh bone, many children who must lose a piece of bone as part of a cancer treatment, who were born with a bone defect, or who suffered severe damage to a bone in an accident would benefit from an artificial replacement that can be extended without repeated surgeries.
Israeli scientist Amir Karniel discusses the human future as robo sapiens.
"In another 50 years, new creatures, a new species of humans will live among us. It is entirely possible that in the future they will make up the majority of humanity. They will be known as robo-sapiens." This declaration did not come from science fiction master Isaac Asimov, but rather from Dr. Amir Karniel, an expert in electrical engineering from the Technion, a researcher in the area of "motor control," which aims to discover how the brain controls the movement of the body, sends instructions and receives feedback from the nervous system.
One needn't replace an entire extremity with an electro-mechanical device to become, in effect, a robo sapiens. As Karniel points out, we've already started in that direction with such implanted devices as pacemakers. There are also lots of ways to enhance a human body with sensors that would leave all the extremities and muscles intact.
Picture yourself having nanotech light sensors embedded in your retinas, and even on the outer layers of your eyes, that let you shift into infrared or enhance your sight in low-light conditions. Another useful enhancement would be a far more sophisticated replacement for hearing aids, made by replacing the cilia in the ears with nanotubes and nanofibers that could pick up much lower sound levels. The nanotube cilia could even be shifted into configurations that allow one to hear high-frequency sounds outside the normal human hearing range.
Another appealing enhancement is Robert Freitas's proposal for respirocytes: replacements for red blood cells with a much greater capacity to store oxygen. These devices could carry so much more oxygen that they would reduce the risk of drowning, of dying from smoke inhalation in a fire, or of dying from lack of oxygen in other ways. Of course they would enhance athletic performance as well.
Damage to the hippocampus at the base of the brain can leave a person unable to form new memories. One solution to the problem that is nearing testing is to build a chip that performs all the functions of the hippocampus.
The world's first brain prosthesis - an artificial hippocampus - is about to be tested in California. Unlike devices like cochlear implants, which merely stimulate brain activity, this silicon chip implant will perform the same processes as the damaged part of the brain it is replacing.
The prosthesis will first be tested on tissue from rats' brains, and then on live animals. If all goes well, it will then be tested as a way to help people who have suffered brain damage due to stroke, epilepsy or Alzheimer's disease.
A team led by Theodore W. Berger of USC spent 10 years building a mathematical model of the hippocampus and then programming it into a silicon chip.
Slices of rat hippocampus were stimulated with electrical signals millions of times, until scientists could be sure which input produced a corresponding output.
Putting the information from each slice together, the researchers were able to devise a mathematical model of a whole hippocampus.
The model was then programmed on to a chip.
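The stimulate-record-model loop described above is a classic system-identification approach. Here is a toy sketch of the idea in Python: treat the tissue as a black box, collect many input/output pairs, and fit a model that reproduces the mapping. The simple polynomial model and all names here are illustrative assumptions; Berger's group fit far more elaborate nonlinear dynamical models to the slice data.

```python
import numpy as np

rng = np.random.default_rng(0)

def hippocampal_slice(stimulus):
    """Stand-in for the real tissue: an unknown nonlinear response."""
    return 0.8 * stimulus - 0.3 * stimulus**2 + 0.05 * stimulus**3

# "Stimulated with electrical signals millions of times" -- here, a few
# thousand samples of stimulus intensity and the response each produced.
stimuli = rng.uniform(0.0, 2.0, 5000)
responses = hippocampal_slice(stimuli)

# Fit a cubic polynomial input -> output model to the recorded pairs.
coeffs = np.polyfit(stimuli, responses, deg=3)
model = np.poly1d(coeffs)

# The fitted "chip" model now reproduces the slice's behavior on new input.
test_input = 1.5
print(abs(model(test_input) - hippocampal_slice(test_input)) < 1e-6)  # True
```

Once a model like this matches the tissue's input/output behavior closely enough, it can be burned into hardware and, in principle, stand in for the tissue itself.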
From the University of Southern California web site of team leader Theodore W. Berger:
The research of Dr. T.W. Berger involves the complementary use of experimental and theoretical approaches to developing biologically constrained mathematical models of mammalian neural systems. The focus of the majority of current research is the hippocampus, a neural system essential for learning and memory functions. The goal of this research is to address three general issues: (1) the relation between cellular/molecular processes, systems-level functions, and learned behavior; (2) the extent of which the functional dynamics of neural systems are altered by activity-dependent synaptic plasticity; (3) the extent to which the essential functions of a neural system can be incorporated within a hardware representation (e.g., VLSI circuitry).
Experimental studies involve the use of extracellular, intracellular, and whole-cell electrophysiological recording techniques, applied in vivo using anesthetized and chronically implanted animals, and in vitro using hippocampal slice preparations. A number of neurobiological issues are being investigated, including: (1) quantifying the signal processing capabilities of hippocampal neurons and the extent to which these capabilities reflect regulation due to feedforward and feedback circuitry vs. intrinsic neuronal mechanisms, such as voltage-dependent conductances or second messenger biochemical systems; (2) the spatio-temporal distribution of activity in neural networks and its dependence on input pattern and network connectivity; (3) the cellular mechanisms underlying changes in the strength of connections among neurons, i.e., synaptic plasticity, and the influence of synaptic plasticity on signal processing characteristics of neurons and the spatio-temporal distributions of activity in networks.
These and other experimental studies are used in conjunction with several different theoretical approaches to develop models of: (1) the nonlinear, input/output properties of single hippocampal neurons and circuits composed of several populations of hippocampal neurons (in collaboration with Dr. V. Marmarelis, Biomedical Engineering, USC), (2) the hierarchical relationship between synaptic and neuronal events (in collaboration with Dr. G. Chauvet, Institute for Theoretical Biology, University of Angers, France), (3) the kinetic properties of glutamatergic receptor subtypes, and (4) adaptive properties expressed by the "hippocampal-like" neural networks implemented with analog VLSI technology (in collaboration with Dr. B. Sheu, Electrical Engineering, USC).
Suppose the initial tests on rats are successful and the group wants to move on to trying it in humans. There seems to be a problem with how to get patient consent. People who can't form new long term memories may be unable to have the treatment explained to them well enough to be able to evaluate the risks and potential benefits.
Still, this is weird wild stuff. If a chip can be made to emulate the hippocampus can the chip's algorithms be improved upon to make it better than the hippocampus? Could it be turned up to stimulate learning when one is studying material one needs to remember?
Steve Potter of Georgia Institute of Technology has built a hybrid rat neuron robot called a hybrot.
In his experiment, Potter places a droplet of solution containing thousands of rat neuron cells onto a silicon chip that’s embedded with 60 electrodes connected to an amplifier. The electrical signals that the cells fire at one another are picked up by the electrodes which then send the amplified signal into a computer. The computer, in turn, wirelessly relays the data to the robot.
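The signal path just described - electrode voltages amplified, digitized, and relayed to the robot - can be sketched schematically. Everything below is a made-up illustration of the pipeline, not Potter's actual decoder; the real system records 60 channels of spiking activity and the mapping to robot motion is far richer.

```python
def amplify(samples, gain=1000.0):
    """Scale raw microvolt-level electrode readings up to usable levels."""
    return [v * gain for v in samples]

def to_motor_command(channel_values, threshold=0.5):
    """Crude hypothetical decoder: steer toward whichever side is more active."""
    left = sum(v for v in channel_values[:30] if v > threshold)
    right = sum(v for v in channel_values[30:] if v > threshold)
    if left > right:
        return "turn_left"
    if right > left:
        return "turn_right"
    return "forward"

# 60 raw electrode readings (values invented for illustration).
raw = [0.0004] * 30 + [0.0009] * 30
command = to_motor_command(amplify(raw))
print(command)  # right-side channels more active -> "turn_right"
```

The return path works the same way in reverse: the robot's sensor readings are converted into patterned electrical stimulation delivered back to the culture.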
Endless science fiction parallels come to mind. How about the Star Trek original series episode where Spock's brain was stolen in order to use it to run a planet? Rolf Pfeifer of the University of Zurich, Switzerland foresees the use of neurons to make self-healing computer systems. Neuronal stem cells could be induced to form new connections to repair damage. Picture hybrot battlebots that would be silicon-biological hybrids that would be extremely difficult to kill.
If human neurons were used to make hybrots then how many neurons would it have to have before we'd hear demands for the recognition of hybrot rights? But if the hybrots were designed to desire to kill a large portion of humanity how could they be granted rights?
Existing computational paradigms do not approach the capabilities of even simple organisms in terms of adaptability and real-time control. There are computational mechanisms and network architectures in living neural systems that are missing from even the most sophisticated artificial computing systems. This project consists of the development of computational systems that incorporate both living neuronal networks and artificial elements, including robotic testbeds and signal-processing circuitry. These hybrid neuronal-robotic systems (‘Hybrots’) will provide a platform for discovering, exploring, and using the computational dynamics of living neuronal networks to perform real-time tasks in the physical world. Cultured networks of molluscan and mammalian neurons will be interfaced to robotic systems via multi-electrode array substrates capable of distributed, spatio-temporal stimulation and recording of neural activity. Unlike brains in animals, in vitro networks are amenable to detailed observation and manipulation of every cell in the network. Both high-speed optical recording, and time-lapse microscopy will be employed. By embodying the networks with actuators and sensors, the dynamical attractor landscape of neuronal networks will be studied under the conditions for which they evolved: continuous real-time feedback for adaptive behavioral control.
"We call it the 'Hybrot' because it is a hybrid of living and robotic components," he said. "We hope to learn how living neural networks may be applied to the artificial computing systems of tomorrow. We also hope that our findings may help cases in which learning, memory, and information processing go awry in humans."
The team uses networks of cultured rodent brain cells as the Hybrot's brain, and has essentially given the cultured neural networks a body in the form of a mobile robot. Potter's group hopes the research will lead to advanced computer systems that could some day assist in situations where humans have lost motor control, memory or information processing abilities. The neural interfacing techniques they are developing could be used with prosthetic limbs directly controlled by the brain. Advances in neural control and information processing theory could have application, for example, in cars that drive themselves or new types of computing architectures.
Inside Potter's lab, a droplet containing a few thousand living neurons from rat cortex is placed on a special glass petri dish instrumented with an array of 60 micro-electrodes. The neurons are kept alive in an incubator for up to two years using a new sealed-dish culture system that Potter developed and patented. The neural activity recorded by the electrodes is transmitted to the robot, the Khepera, made by K-Team S.A, which serves as the body of the cultured networks. It moves under the command of neural activity that is being transmitted to it, and information from the robot's sensors is sent back to the cultured net in the form of electrical stimuli.
Central to the experiments is Potter's belief that over time, the team will be able to establish a living network system that learns like the human brain.
A press release from Argonne National Laboratory reports on an attempt to make an embeddable replacement artificial retina.
LOS ANGELES, Nov. 26, 2002 – Secretary of Energy Spencer Abraham toured the University of Southern California's ophthalmology laboratories at the Doheny Eye Institute and heard from the national research team that hopes to restore vision to millions of people with blindness caused by retinal disorders. As a result of recent breakthroughs in science and engineering technology, Abraham announced that DOE will commit $9 million over three years to augment artificial retina research, including support for a laboratory within the Doheny Eye Institute on the USC campus.
The DOE national labs, partnering with the University of Southern California and North Carolina State University, are designing a micro-electronic device that would be implanted in the eye on the surface of the retina. A microelectrode array would perform the function of normal photoreceptive cells.
You can find more about this recent announcement in this UPI article.
Optobionics cofounders Vincent Chow and Dr. Alan Chow have invented and already surgically implanted their Artificial Silicon Retina in some human test subjects. The seeing ability it provides is still pretty crude. But it's impressive that there are actually people walking around using their device. An ABC News report from May 2002 describes the results for some test subjects:
Two years ago, Chow put an artificial retina into Bennett's right eye. Before surgery she couldn't see a thing, but when the bandages came off, she was shocked.
She can now see light and shadow, which means she can slowly find her way around. Bennett still cannot see shapes. But is it better than what she had before?
"Oh Lord, yes, yes," Bennett said.
John Crocker, another of Chow's patients, was blind for more than 50 years. But right after his operation, he got a huge surprise.
"I was walking through the house," Crocker said. "And I stopped and I looked and I could see the lights on our Christmas tree, which is the first time that's ever happened for a long time."
However, the Optobionics implant is providing more benefit than expected because it's providing a source of stimulation that is somehow improving the functioning of the real retina:
How the Optobionics ASR works:
What Dr. Chow found is that the chips also seem to be stimulating remaining healthy cells.
"We're pretty excited. We initially expected only some light perception where the implant was. What seems to be improvement outside the areas was unexpected," he said.
He said the device is having a "rescue effect" on the retina, restoring cells located near the implant site.
"What we think is happening is the implant is stimulating other cells around the retina. We're finding vision is improving not just where the implant is but also in areas near the implant," he said.
The ASR™ microchip is a silicon chip 2mm in diameter and 25 microns thick, less than the thickness of a human hair. It contains approximately 5,000 microscopic solar cells called “microphotodiodes,” each with its own stimulating electrode. These microphotodiodes are designed to convert the light energy from images into electrochemical impulses that stimulate the remaining functional cells of the retina in patients with AMD and RP types of conditions.
The ASR microchip is powered solely by incident light and does not require the use of external wires or batteries. When surgically implanted under the retina—in a location known as the “subretinal space”—the ASR chip is designed to produce visual signals similar to those produced by the photoreceptor layer. From their subretinal location, these artificial “photoelectric” signals from the ASR microchip are in a position to induce biological visual signals in the remaining functional retinal cells which may be processed and sent via the optic nerve to the brain.
Click thru to the previous link to see a picture of the ASR on a penny. It's quite small.
The Blindness Foundation is supporting a number of other groups which are also working on artificial retina development:
Several other research groups are working to develop an artificial retina. The Foundation currently supports two groups: Dr. Eugene de Juan and Mark Humayun of The Foundation’s Research Center at Johns Hopkins University, and Drs. Joseph Rizzo and John Wyatt, of Harvard Medical School and Massachusetts Institute of Technology, respectively. The Foundation also supports Dr. Richard Normann at the University of Utah, who is developing a silicon chip to be implanted in the visual cortex of the brain.
Obviously the first users for this sort of technology will be blind people. It will be a wonderful boon that will restore eyesight for millions of people. But what comes next? Once the resolution of an artificial retina can exceed that of a human eye (and that is a matter of when, not if) and it becomes possible to combine it with an artificial iris that has zoom capability, then suddenly artificial eye implants will become attractive for people with perfectly healthy eyes. If the future artificial retinas can be made from thin films that can shift their molecular configurations on-the-fly then it ought to be possible to even reconfigure (perhaps by straining eye muscles in some trained pattern) the retinas to look at different parts of the light spectrum as well. Imagine, for instance, soldiers or police shifting their eyesight into the infrared when on a dangerous nighttime operation. Or imagine just any person wanting to up their light sensitivity when outside at night or in a room with little available light.
Sufficiently advanced technologies developed to treat disease conditions will inevitably morph into technologies that will enhance function. Research on artificial implants for blindness is laying the groundwork for the eventual development of vastly superior artificially enhanced eyesight.
Update: Here are some details on the role that Lawrence Livermore National Laboratory plays in the DOE artificial retina project:
Lawrence Livermore National Laboratory engineers are developing a microelectrode array for a multi-laboratory DOE project to construct an artificial retina or "epiretinal prosthesis."
LLNL's polymer-based microelectrode array.
The three-year DOE project brings together national labs, universities and a private company, with Oak Ridge serving as the lead laboratory.
An epiretinal prosthesis could restore vision to millions of people suffering from eye diseases such as retinitis pigmentosa, macular degeneration or those who are legally blind due to the loss of photoreceptor function. In many cases, the neural cells to which the photoreceptors are connected remain functional.
Project leader Dr. Mark Humayun, of the University of Southern California, has shown that electrical stimulation of the viable retinal cells can result in visual perception. These findings have sparked a worldwide effort to develop a retinal prosthesis device.
Expertise in biomedical microsystems at Lawrence Livermore's Center for Microtechnology is being tapped to develop a "flexible microelectrode array," able to conform to the curved shape of the retina, without damaging the delicate retinal tissue, and to integrate electronics developed by North Carolina State University. The device will serve as the interface between an electronic imaging system and the human eye, directly stimulating neurons via thin film conducting traces and electroplated electrodes.
"We're very excited to be a part of this collaboration," said Peter Krulevitch of the Lab's Center for Microtechnology and leader of the team developing the flexible microelectrode array. Other LLNL team members include LLNL employee and UC Davis graduate student Mariam Maghribi, fabrication technician Julie Hamilton, participating guest Dennis Polla, undergraduate summer student Armando Tovar from Trinity University, MIT graduate student Christina Park, engineer Courtney Davidson and scientist Tom Wilson.
Lab engineers have pioneered the use of poly(dimethylsiloxane), a form of silicone rubber simply called PDMS, in fabricating hybrid integrated microsystems for biomedical applications. In particular, the Lab has worked on "metalization" -- applying metals for electronics and electrodes to PDMS for implant devices.
"It's our important contribution to this project," Krulevitch said. "We've developed a technique for fabricating metal lines that can be stretched. This is really critical for a flexible device designed to conform to the shape of the retina."
The electronic array must be robust enough to withstand damage from the implant procedure and be biocompatible -- able to withstand the physiological conditions in the eye. Another reason for using PDMS is that silicone rubber is not only flexible, but is a promising material from a biocompatibility standpoint.
Humayun's group implanted three first-generation LLNL devices in a dog's eye to identify needed design and fabrication improvements. Livermore engineers are now working on a second-generation microelectrode array with smaller electrodes in greater numbers, and developing techniques to integrate the electrodes with electronics chips. The array's perimeter -- 4 mm across -- has been reinforced with micromolded ribs to facilitate handling and prevent curling or folding. The current version of the array is longer for short-term implant experiments. But the final device for implant will measure 4 mm by 4 mm.
Applications for the flexible electrode array go beyond the retinal prosthesis, according to Krulevitch, who says it has the potential to allow development of next-generation medical implant devices such as the "cochlear implant" for hearing. The technology could one day be used for "deep brain stimulation devices" for treating such diseases as Parkinson's, and spinal cord stimulation devices for treatment of chronic pain.
Partners in the project include Oak Ridge, Argonne, Sandia, Los Alamos, USC Doheny Eye Institute and North Carolina State University.
For more information on the overall DOE Artificial Retina project, check the Web at: http://www.energy.gov/HQPress/releases02/novpr/pr02248.htm
Founded in 1952, Lawrence Livermore National Laboratory has a mission to ensure national security and to apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by the University of California for the U.S. Department of Energy's National Nuclear Security Administration.
"Speak the speech, I pray you, as I pronounced it to you, trippingly on the tongue; but if you mouth it, as many of your players do, I had as lief the town-crier spoke my lines." How about tongue braille as a novel way to read Hamlet?
The tongue, asserts Paul Bach-y-Rita, is a terrific portal to the brain. The UW-Madison physician and inventor says the tongue might serve as the ideal tactile environment to help blind people navigate, give Navy Seals directions in dim underwater environments and guide urban search-and-rescue teams as they comb the confusion of smoke-filled buildings for people to rescue.
"You don't see with your eyes, you see with your brain," says Bach-y-Rita, who, with colleague Kurt Kaczmarek, has applied for a patent on a device that uses electrical impulses to route spatial information through the tongue to the brain.
"The brain is very malleable," says Bach-y-Rita. "You can compensate for sensory loss by rehabilitating the brain" and turning to surviving sensory systems such as the skin and the tongue to substitute for lost vision.
Loaded with nerves and bathed in its own conductive saline solution, the tongue is an ideal surface for a tiny array of 144 electrodes that can, through the coordinated firing of mild electrical impulses, route images from a camera, computer or other device straight to the brain.
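Routing a camera image to a 144-electrode array essentially means downsampling the picture to a 12x12 grid and turning each cell's brightness into a stimulation intensity. Here is a minimal sketch of that mapping; the 12x12 grid matches the 144 electrodes mentioned above, but the pooling scheme and intensity scaling are assumptions for illustration.

```python
import numpy as np

def image_to_electrodes(image, grid=12, max_intensity=255):
    """Average-pool a grayscale image down to a grid x grid electrode array."""
    h, w = image.shape
    bh, bw = h // grid, w // grid
    # Collapse each block of pixels to one electrode value.
    pooled = image[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw).mean(axis=(1, 3))
    # Normalize brightness to stimulation intensity levels.
    return np.round(pooled / image.max() * max_intensity).astype(int)

# A made-up 120x120 camera frame of 8-bit pixel values.
camera_frame = np.random.default_rng(1).integers(0, 256, (120, 120))
electrodes = image_to_electrodes(camera_frame)
print(electrodes.shape)  # (12, 12) -- one intensity per electrode
```

Each of the 144 resulting values would then drive the coordinated firing of one electrode on the tongue array.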
New miniaturized electronics, say Bach-y-Rita and Kaczmarek, will permit the device to be as small or smaller than a dental retainer and enable it to be built directly into the respirators used by divers and firefighters.
A related, tongue-based application is being developed by UW-Madison researcher Mitch Tyler to help people who have lost their sense of balance.
The technology has even caught the attention of some in the video gaming industry who see it as a bold new frontier for controlling the action of electronic gaming.
In addition to systems for the blind, Bach-y-Rita says the technology could have other applications, because designers can create impulses from any measurable source.
He is in discussions with the military regarding devices to allow divers to "see" more effectively through murky water using their mouthpieces, or to allow soldiers to receive night vision readouts through their tongues. He adds that the tongue sensors could one day be used in conjunction with video games, and his team has received a federal grant for a system that will aid people who've lost their sense of balance.
Researchers at the Naval Aerospace Medical Research Laboratory and the Institute for Human and Machine Cognition used Bach-y-Rita’s ideas to cram a pilot’s brain with expanded spatial awareness akin to sight. Instead of electrodes on the tongue, the Tactile Situation Awareness System uses a flight suit embedded with as many as 96 transducers – mini-vibrators like the ones found in cell phones. The TSAS makes pilots less dependent on their eyes. "The visual workload has gone up so high that we’re seeing an increase in the number of human factor-related mishaps," says Anil Raj, who heads the program at the University of West Florida. Now pilots can gauge their orientation from a buzz on the torso. If the plane banks left, they feel a zap on the left. If the plane makes a 180-degree turn, the zap will travel from one side of the body to the other. It usually takes months of training before pilots can look at their altimeters, attitude indicators, and compasses and understand a plane’s location in space. With TSAS, it takes 10 minutes.
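The "zap travels with the turn" behavior amounts to mapping the aircraft's roll angle onto a ring of tactors around the torso. Below is a hedged sketch of that mapping; the ring layout and the use of all 96 transducers this way are assumptions for illustration, not the actual TSAS design.

```python
def active_tactor(roll_degrees, n_tactors=96):
    """Return the index of the tactor nearest the roll direction.

    Tactor 0 is assumed to sit at the wings-level reference position,
    with indices increasing around the torso.
    """
    angle = roll_degrees % 360
    return round(angle / 360 * n_tactors) % n_tactors

print(active_tactor(0))    # 0  -- wings level
print(active_tactor(90))   # 24 -- banked 90 degrees, a quarter of the way around
print(active_tactor(180))  # 48 -- inverted, opposite side of the torso
```

Sweeping the active index as the angle changes is what makes the sensation travel from one side of the body to the other during a 180-degree turn.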
Sounds like this project is still at a fairly early stage of development. Still, what is interesting is that this sort of technology is under active development in the first place. Electronics technology has advanced far enough that embeddable sensor systems can be moved into active development.
Using a tiny wireless sensor developed at Oak Ridge National Laboratory, doctors will know in minutes instead of hours if an organ is getting adequate blood flow after transplant or reconstructive surgery.
Conventional methods for assessing circulation involve invasive procedures or extensive laboratory testing. In some cases, by the time doctors realize there isn't adequate blood flow to an organ or tissue, irreversible damage already has occurred.
"Our goal is to offer a technique that provides the physician with a very early indication of whether the surgery is successful," said Nance Ericson, who leads the effort from ORNL's Engineering Science and Technology Division. Ericson is working with Mark Wilson, a surgeon at the University of Pittsburgh, and Gerard Coté of Texas A&M University.
The tiny implantable sensor – about the diameter of a quarter -- and micro-instrumentation being developed by Ericson would provide real-time information by transmitting data to a nearby receiver. Specifically, the unit employs optical sensors to assess tissue circulation. Preliminary tests using laboratory rats have provided encouraging results.
Embedded optical sensors could do other kinds of measurement as well:
Although not a part of this project, Ericson sees this leading to several other photonics-based microsensors for making measurements in a number of areas. For example, this approach could be useful for measuring arterial blood gases, which are primary indicators of respiratory function, or serum lactate, which is a marker for the severity of tissue injury. Current methods require obtaining blood samples and then sending those samples to a lab for analysis.
An article in The Globe And Mail written by Shafiq Qaadri surveys the progress in developing a large number of kinds of artificial implants for the human body:
But the next level of integration is bionics (bio-mechanics), in which the body talks to the machine, actually giving the artificial part its cue to function. Dextra, a prosthetic hand developed at Rutgers University, is one of the first artificial limbs to use a person's own nerves to feed electricity to the machine's fingers.
"Communication is key," says Dr. William Craelius, the biomedical engineer who developed Dextra. "Human-machine communication could soon lose its distinction as the No. 1 obstacle to bionics." With a seamless human-to-device connection, Dextra patients have such natural control that they can type and play the piano.
With the development of synthetic muscle, entire joints need not be replaced, but select muscles can be restored.
So far, scientists at the Artificial Muscle Research Institute at the University of New Mexico hope to help people who have lost muscle function. But as the technology progresses, researchers could also reinforce existing muscles, perhaps inserting muscles into new locations, leading to entirely new movements and power.
The idea here is not just to have better control of the rate of delivery. A lot of large biological macromolecules cannot be taken orally. In order to avoid the need for injection this approach puts the drug delivery system inside the body:
A small onboard processor, packaged with the drug-holding chip, choreographs how each of the 400 wells opens at precisely the right moment over a period of, say, six months. In the case of the MicroCHIPS prototype, the processor is an off-the-shelf model similar to those that power handheld calculators. The whole drug-delivery system finishes up as an implantable device no bigger than a cardiac pacemaker. After being implanted, blood vessels grow around the chip, allowing the medication to diffuse straight into the capillary network.
More advanced models will sense drug levels and will use a radio transmitter to return the information to a receiving device outside the body.
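To make the timed-release idea concrete, here is a minimal sketch of the kind of schedule such an onboard processor might follow: 400 wells opened at evenly spaced intervals over roughly six months. All names and numbers below are illustrative assumptions, not details of the actual MicroCHIPS design.

```python
from datetime import datetime, timedelta

# Hypothetical parameters: 400 drug wells released over a ~6-month course.
WELLS = 400
COURSE_DAYS = 180

def release_schedule(start, wells=WELLS, course_days=COURSE_DAYS):
    """Return the time at which each well would be opened,
    spacing releases evenly across the whole course."""
    interval = timedelta(days=course_days / wells)  # ~0.45 days apart
    return [start + i * interval for i in range(wells)]

schedule = release_schedule(datetime(2003, 1, 1))
print(len(schedule))              # number of scheduled releases
print(schedule[1] - schedule[0])  # spacing between consecutive releases
```

A real implant would of course run this logic on the embedded processor and trigger an electrical pulse to dissolve each well's membrane, but the scheduling itself is this simple in principle.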
Kevin Warwick, a professor at the University of Reading in the UK, foresees a day when brain implants will send signals between humans, rendering spoken language obsolete:
His ideas get weirder. Warwick looks forward to the day when implants might allow the body’s functions like heart rate, blood pressure and temperature to be monitored in real time. His most bizarre vision: the world of 2050 dominated by cyborgs, their brains all linked to a global network, sharing access to a common super-intelligence. Network police could be summoned at the mere thought of crime.
This reminds me of the 1967 paranoid classic The President's Analyst, which has a great part where James Coburn's character is kidnapped by The Phone Company, which wants him to convince the President to let it implant chips in everyone's brains so that people can dial a phone number just by thinking it.
Here's an EE Times report on Kurzweil's comments at the Fall Sensors Expo:
BOSTON — Using deliberately provocative predictions, speech-recognition pioneer Ray Kurzweil said that by 2030 nanosensors could be injected into the human bloodstream, implanted microchips could amplify or supplant some brain functions, and individuals could share memories and inner experiences by "beaming" them electronically to others.
The NY Times also reported on Kurzweil's speech:
"We're limited to 100 trillion connections," said Mr. Kurzweil, alluding to current estimates of the processing power in the human brain. "I don't know about you, but I find that quite limiting."
You can go to Kurzweil's site and watch the AI Ramona perform.
As this article about Children's Hospital in Boston shows, heart-lung bypass machines are used to allow tired organs to rest:
As he drove back to Children's in the dark, he ordered Jordan placed on a special heart-lung bypass machine that would give her heart a rest - and keep her alive - while doctors figured out what to do.
It was an emergency, and the cost - $3,600 a day - did not enter his mind. ''We try to do things as efficiently as we can,'' he said, ''but I am not prepared to cut corners in a life-and-death situation.''
The tests so far have run only for short periods. Researcher Robert Bartlett next intends to repeat them for 30-day periods and hopes to begin the first human trials in a year or two.
In early tests, sheep with damaged lungs were either put on standard therapy — the mechanical ventilator — or given the artificial lung plus a ventilator. All the sheep with the artificial lung survived, while some on the ventilator alone died, Bartlett says.
Additionally, the sheep using the device had only minimal use for the ventilator, suggesting that “it provided nearly 100 percent of their normal lung function,” he says.
Importantly, there were no deleterious effects. “The only problem was a sheep falling asleep and knocking it over,” he says with a smile.
One use for artificial organs will be to keep a patient alive while waiting for a new organ to be grown from the patient's own cells. The initial use of the artificial liver described below, though, will be to keep patients alive while they wait for a suitable donor liver:
BERLIN – Four years after the first American clinical trial of an experimental artificial liver system began at the University of Michigan Health System, its leader says he is encouraged by the results thus far. And, he's optimistic about the system's potential to help more liver-failure patients stay alive until they receive a liver transplant, or recover without a transplant.
Already, says Robert Bartlett, M.D., 20 desperately ill patients at the U-M Health System have used the device in a phase I trial. Six patients went on to receive a transplant, three of whom are still alive. Two other patients recovered liver function without needing a transplant. Data on the first nine U-M patients were published last August in the journal Surgery; Bartlett discussed the full group at a meeting this week in Germany.
Results from Germany, where the system was invented, and from the three other American hospitals now testing it, also give Bartlett hope. In all, the system appears safe, able to reduce blood toxins, and able to reverse coma and shock.
The system, called albumin dialysis, uses special filters and proteins to remove toxic substances from the blood while sparing helpful compounds.
Bartlett spoke about the status of the albumin dialysis approach to liver support in the keynote and summary addresses at the Fourth International Symposium on Albumin Dialysis in Liver Disease this week in Rostock, Germany.