Trend Watch

Wednesday, March 5, 2008

Plucking Cells out of the Bloodstream

A new implantable device can extract stem cells for therapeutic transplant or program cancer cells to die.



Cell catcher: University of Rochester bioengineer Michael King holds up a section of plastic microtubing lined with proteins that trap cancer and stem cells.
Credit: Richard Baker, University of Rochester

Bioengineers have developed an implantable device that captures very pure samples of stem cells circulating in the blood. The device, a length of plastic tubing coated with proteins, could lead to better bone-marrow transplants and stem-cell therapies, and it also shows promise as a way to capture and reprogram cancer cells roaming the bloodstream. The company CellTraffix is commercializing the technology.

When patients get bone-marrow transplants, what they're really receiving are infusions of a type of adult stem cell. Bone-marrow-derived stem cells play a crucial role in renewing the blood throughout adulthood, creating new cells to carry oxygen and fight infections. These adult stem cells can be sampled using the new device.

The new device mimics a small blood vessel: it's a plastic tube a few hundred micrometers in diameter that's coated with proteins called selectins. The purpose of selectins in the body seems to be to slow down a few types of cells so that they can receive other chemical signals. A white blood cell, for instance, might be instructed to leave the circulation and enter a wound, where it would protect against infection. "Selectins cause [some] cells to stick and slow down," says Michael King, a chemical engineer at the University of Rochester who's developing the cell-capture devices. Different types of selectins associate with different kinds of cells, including platelets, bone-marrow-derived stem cells, and immune cells such as white cells.

In an upcoming publication in the British Journal of Haematology, King reports that selectin-coated microtubes implanted in rats can capture very pure samples of active stem cells from circulating blood. He gave a similar demonstration of stem-cell purification with samples taken from human bone marrow last year. Cancer patients often require bone-marrow transplants following harsh chemotherapy and radiation treatments that kill adult stem cells in the blood.

The purity of these transplants can be a matter of life or death. When the transplant is derived from the patient's own bone marrow--extracted before treatment--it's critical that it not contain any cancer cells. When it comes from another person, there's a chance that the donor's immune cells will attack the recipient if they're not filtered out. But current purification methods are slow and inefficient, King says. Those that rely on antibody recognition or cell size and shape typically extract only a small fraction of the stem cells in a blood sample; the rest go to waste.

Twenty-eight percent of the cells captured by King's implants were stem cells. "This is astounding given how rare they are in the bloodstream," says King. Implants would probably not be able to capture enough stem cells for transplant. But King believes that filtering a donor's blood through a long stretch of selectin-coated tubing outside the body, in a process similar to dialysis, would be very efficient. "This technique will clearly be useful outside the body" as a means of purifying bone-marrow-derived stem cells, says Daniel Hammer, chair of bioengineering at the University of Pennsylvania.

Hammer believes that King's devices will also have broader applications as implants that serve to mobilize a person's own stem cells to regenerate damaged tissues. By slowing down cells with selectins and then exposing them to other kinds of signals, says Hammer, King's devices "could capture stem cells, concentrate them, and differentiate them, without ever having to take the cells out of the body." There might be a way to use selectins to extract neural stem cells, too.


"This is a very broad-reaching discovery," says Hammer. Indeed, King says that he has already had some success using selectin coatings to reprogram cancer cells.

Cancer cells appear to hijack selectin pathways in order to spread to other parts of the body, a process known as metastasis. Tumors shed cells into the bloodstream. Some of those cells seem to exit the circulation with the help of selectins; ensconced in new tissue, they then establish new tumors. These secondary tumors cause more cancer deaths than initial tumors do.

King says he has unpublished work demonstrating that leukemia cells that roll along a coating of selectins and a cancer-specific signaling molecule will go through a process called programmed cell death. Healthy stem cells also roll across the device because they're attracted to the selectins, but the death signal doesn't affect them. Leukemia is a blood cancer, but King expects that the anticancer coating would work for solid tumors as well. Devices lined with these coatings might be implanted into cancer patients to prevent or slow metastasis.

King hopes to test antimetastasis implants in animals this year. He's collaborating with Jeffrey Karp, a bioengineer at the Harvard-MIT Division of Health Sciences and Technology, and Robert Langer, an MIT Institute Professor, to develop selectin coatings that are stable over months rather than days.

CellTraffix CEO Tom Fitzgerald says that the company's first product, a kit that will enable researchers to capture large numbers of stem and cancer cells in the lab, will likely reach the market early next year. The company hopes to begin clinical testing of the anticancer coatings by early 2010.

http://www.technologyreview.com/Biotech/20204/page1/

Improving Toxicity Tests

A new initiative will work on cell-based toxicity tests for chemicals.

Credit: Technology Review

As chemical companies develop more pesticides, cleaners, and other potentially toxic compounds, traditional methods of safety testing can hardly keep up. Animal tests, which have been the gold standard for decades, are slow and expensive, and these sorts of tests are increasingly socially unacceptable, too. What's more, the results of animal testing sometimes don't translate to humans, so researchers are eager for better alternatives.

This week, at the annual meeting of the American Association for the Advancement of Science in Boston, the U.S. Environmental Protection Agency and the National Institutes of Health (NIH) announced a multiyear research partnership to develop a cell-based approach that they hope can replace animal testing in toxicity screening. Work has already begun, although it will take years to refine the techniques.

Using systems that are already employed in the search for new drugs, researchers hope to develop quick, accurate methods of toxicity testing for chemicals that are carried out on cells, rather than on whole animals.

That way, researchers could test thousands of chemicals in a matter of hours using automated systems and human cells grown in a lab, instead of spending weeks dosing and dissecting roomfuls of rabbits or rats. Different kinds of cells could be used as proxies for particular tissues, providing a way for researchers to test the effects of a chemical on the liver, for example, and, ultimately, to predict toxic effects.

The approach "really has the potential to revolutionize the way toxic chemicals are identified," says Francis Collins, director of the National Human Genome Research Institute. Automated cell-based tests could screen many thousands of chemicals in a single day, compared with the decades spent so far gathering detailed information on a few thousand toxic chemicals.

"We need to be able to test thousands of compounds in thousands of conditions much faster than we did before," says Elias Zerhouni, director of the NIH. The new approach repurposes a technique that's a mainstay in pharmaceutical labs, where high-throughput screening is used to help identify new drugs. Automated systems can test hundreds of thousands of candidate compounds in a single day and identify those that have any effect on cells, and hence may have therapeutic value. The aim of the toxicity-testing research is "to try to turn that around to find compounds that might be toxic," Collins says. Their effects could be assessed according to the number of cells they kill, or by using markers that indicate whether certain functions in a cell are affected.

Because high-throughput screening can handle many thousands of tests at a time, a given chemical can be tested at different concentrations and for different exposure times during a single screening process, producing comprehensive and reliable data that's "not a statistical approximation," says Christopher Austin, director of the NIH Chemical Genomics Center. "It's pharmacology."


"In order to get the answers you want, you need to do all the concentrations, all the times, and that's why you need to have a high-throughput system," Austin says.

Researchers at the NIH have already used high-throughput screening to test several thousand chemicals over a range of 15 concentrations varying by several orders of magnitude, and for exposure times ranging from minutes to days. The chemicals they picked have well-known toxic effects, gleaned from animal studies. By comparing data from high-throughput tests with that from animals, researchers should be able to fine-tune cell-based tests so that they're at least as reliable and as informative as animal experiments.
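
To get a sense of the scale involved, consider a back-of-the-envelope sketch in Python. The chemical count, dose spacing, and time points below are illustrative assumptions loosely based on the article's figures, not the NIH's actual protocol.

    # Rough size of the screening matrix the article describes.
    n_chemicals = 2800                                       # "several thousand chemicals" (assumed)
    concentrations = [10 ** (e / 2) for e in range(-8, 7)]   # 15 doses spanning ~7 orders of magnitude
    exposure_times_h = [0.25, 1, 4, 24, 96]                  # minutes to days (assumed sampling points)

    n_tests = n_chemicals * len(concentrations) * len(exposure_times_h)
    print(f"{n_chemicals:,} chemicals x {len(concentrations)} doses x "
          f"{len(exposure_times_h)} time points = {n_tests:,} cell-based tests")
    # ~210,000 tests; at the hundreds of thousands of wells per day that
    # automated screening can run, this is days of work, not decades.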

"Animals are not always giving us the right answer," says John Bucher, associate director of the National Toxicology Program, "so we need to use all the information we can get from different systems."

In a sense, Austin says, this new approach turns the animal-testing procedure "upside down." Rather than giving a rat a chemical and then dissecting the animal and examining its tissues to see the effect of the compound, metaphorically, "we are dissecting the rat first into its component cells, then computationally putting the rat back together."

However, it will take years for researchers to prove--if they can--that cell-based toxicity screening can supersede animal tests, so "you cannot abandon animal testing overnight," Zerhouni says. "It will have to be intertwined for a few years."

http://www.technologyreview.com/Biotech/20294/page1/

Mobile Carriers See Gold in Femtocells

If consumers buy in to private wireless phone networks, the industry could save money.

Can you hear me now? Airvana's HubBub femtocell (above) could provide better cellular reception inside homes and offices.
Credit: Airvana

On its face, it sounds like a company's technological fantasy: a product sold to customers that will also save the business itself money.

That's roughly the attraction of a young wireless phone technology called femtocells, which promise to give homes and businesses their own private wireless phone networks.

Similar in concept to the Wi-Fi routers that many people use to blanket their homes with wireless Internet access, these little boxes instead provide a network for carrying the voice and high-speed data services of mobile phones. They're designed to give bandwidth-hungry cell-phone subscribers the strongest possible connections at home. But by keeping those customers off the main mobile network and using home broadband connections to transfer data, they could wind up saving the phone companies money, too.

It's no wonder, then, that equipment vendors say that mobile phone companies are rushing into this market--with technology and even commercial trials beginning on both sides of the Atlantic--even before standards have been set or final technological hurdles cleared.

"Usually in the networking business, you build equipment, and then drum up demand," says Paul Callahan, vice president of business development for Airvana, a femtocell equipment vendor. "This time, demand is already really strong."

The femtocell buzz is part of a broader, years-long push by mobile phone companies to persuade their customers to use cell phones instead of landlines for all their communications needs, and increasingly to use their cells for third-generation (3G) applications such as Web surfing, downloading music, and watching videos.

One hurdle, phone companies say, is that mobile phone coverage inside homes and businesses often isn't as good as it is outside. Some homes are in coverage shadows or have thick apartment walls that impede transmissions. In addition, the Wideband Code Division Multiple Access (W-CDMA) technology used for 3G services by T-Mobile and AT&T in the United States transmits at a higher frequency than does its predecessor, so it has a harder time penetrating walls.

A femtocell would relieve this problem--in theory. Instead of relying on the mobile phone's nearest cellular tower (known in the industry as a base station), which might also be serving scores of other callers at the same time, a customer would have her own private, high-quality cell-phone connection.

"Our goal is to get to a place where our services are available to all users at all times," says John Carvalho, head of core network innovation for Telefónica O2 Europe, which announced femtocell trials this week.

Boosters of the technology paint femtocells as a technology that benefits everyone. Customers get a fast, reliable broadband phone connection at home, and the mobile phone companies get to offload a small piece of their infrastructure investments to their customers.

In effect, every customer who buys and installs his own home femtocell would reduce the load on the carrier's local macro network. The femtocell itself serves as an alternative base station, broadcasting and receiving ordinary wireless signals from cell phones that the femtocell owner permits. This is a strikingly attractive idea, particularly to carriers in big cities that find their networks often overloaded, and find that local regulations or public opinion makes it difficult and costly to set up new antennas.

By using a femtocell, customers will send their voice and data traffic out their own DSL, cable, or fiber connection to the Internet, and then to the carrier's network. This will also reduce the load on the land-based data networks that carry voice and data traffic from the mobile phone companies' base stations to their own central switching facilities. That, in turn, could translate into less infrastructure investment.

Yet all of this will happen only if customers see enough benefit to buying themselves a femtocell--and for now, that's the biggest flaw in this rosy scenario, analysts say.

"What's in it for the user?" asks Keith Nissen, an analyst with the In-Stat research firm. "That's the big question. Right now, there isn't enough."


Broadband subscribers already have fast Internet surfing at home, by definition. Carriers may well offer cheaper cell-phone calls for femto customers using their home connection--but broadband subscribers can already do this using Skype, Vonage, or other voice over Internet protocol (VoIP) services. Strong cell signals at home are certainly a plus, but it's not clear how much consumers will pay for this, analysts say.

Without an obvious consumer must-have attraction, demand will likely be tied closely to price, Nissen says. If a femtocell is cheap enough, consumers will latch on to the idea, assuming (and this can be a big assumption) that carriers are able to explain and market it clearly. But this price may be a sticking point for some time.

Today, the equipment cost for femtocells runs in the range of $250 to $300. Sprint, one of the first companies to start commercial trials of the products, is offering them to consumers in Denver and Indianapolis for $50 apiece, along with an offer of lower-priced calling plans--altogether a substantial subsidy.

O2's Carvalho says that he expects equipment costs to come down to between 50 and 80 British pounds (about $100 to $160) once standards are set and mass-manufacturing begins. That's an acceptable price range for consumers used to buying products such as Wi-Fi modems, he says.

The standards process may take several years, however. Different equipment vendors use different techniques for aspects such as security, or for letting the femtocells talk to the carrier's core network. Femtocells have been developed for both rival 3G mobile phone standards--W-CDMA and CDMA2000--but different standards-setting bodies are separately at work on rules for each.

In the long term, analysts expect femtocells to be a fast-growing, successful market. In-Stat forecasts that 40.6 million femtocells will be distributed around the world by 2011. ABI Research is even more optimistic, projecting 70 million in use by 2012.

By that time or shortly afterward, analysts say, femtocell technology may be built into other devices, such as Internet routers for consumers.

Vodafone, T-Mobile, and O2 all announced trials early this year. Equipment vendors say that many other carriers are in undisclosed trials as well. Commercial deployment beyond the limited scale of Sprint's two-city experiment, with the products distributed to consumers by the phone companies or their retail partners, is expected by early next year.

That's all assuming that consumers react positively when they actually get a chance to see how the technology works.

"If it winds up being more expensive, but it provides better data rates, it's probably worth the investment for us," says O2's Carvalho. "If it's more expensive but slower, and it annoys customers, we probably wouldn't take that on."

http://www.technologyreview.com/Biztech/20293/page1/

Bandwidth on Demand

An academic internet provides clues about ways to improve the commercial Internet.

Big sender: Internet2’s dynamic circuit network will help provide channels for large quantities of information to flow to and from academic research projects, such as CERN’s Large Hadron Collider, above. In the future, the technology may find commercial applications, such as fast transfer of high-definition online video.
Credit: CERN

Internet2, a nonprofit advanced networking consortium in the United States, is designing a new network intended to open up large amounts of dedicated bandwidth as needed. For example, a researcher wanting to test telesurgery technologies--for which a smooth, reliable Internet connection is essential--might use the network to temporarily create a dedicated path for the experiment. The network, called the dynamic circuit network, has immediate applications that are academic, but its underlying technologies could one day filter into the commercial Internet and could be used, for example, to carry high-definition video to consumers.

"The idea here is to basically look at the network in a different way," says Rick Summerhill, CTO of Internet2. The Internet Protocol (IP) currently used for the Web breaks data into packets that are sent through fiber-optic cables to their ultimate destination. The packets don't have to take a common path through the network; routers act like way stations along the network, examining every packet individually and deciding where it should be sent next. The problem with this system is that large data transfers can clog the routers with packets waiting for direction, and if the packets don't make it to their final destination at the same time, the receiver may experience jitter--interruptions to the data stream that can produce skips in online video, for example.

Summerhill says that, using the dynamic circuit network, a researcher could set up a temporary connection to another location that would work like a phone call: the user's data would be carried directly to that other location, uninterrupted by the traffic of others sharing the network. The result is that large quantities of information could be transferred quickly and clearly.

The dynamic circuit network is really an enhancement of a traditional network, rather than a replacement. Internet2 still has a backbone that uses the standard IP common across the Web. What makes the dynamic circuit network different is that it uses a circuit-switched network, which can be set up so that all the packets follow the same path. Also, those circuits don't have to be in place permanently. Lachlan Andrew, a researcher at Netlab, at the California Institute of Technology, explains that a circuit-switched network determines a pathway for the entire stream of packets, so that at every way station, they can be sent on without having to be individually examined. "Internet2 is developing technology to communicate between nodes, find a path, and construct it," he says.
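
The distinction Andrew describes can be illustrated with a toy model. This Python sketch (with hypothetical node names, not Internet2's actual software) counts the per-packet routing decisions that a pre-built circuit avoids.

    # Toy contrast between per-packet IP forwarding and circuit switching.
    routing_table = {  # next hop toward each destination, per node (hypothetical)
        "cern":    {"nebraska": "chicago"},
        "chicago": {"nebraska": "kansas"},
        "kansas":  {"nebraska": "nebraska"},
    }

    def ip_hops(dest, start):
        """IP-style forwarding: every packet is examined at every way station."""
        hop, decisions = start, 0
        while hop != dest:
            hop = routing_table[hop][dest]
            decisions += 1
        return decisions

    def build_circuit(dest, start):
        """Circuit-style: compute the whole path once; packets follow it unexamined."""
        path = [start]
        while path[-1] != dest:
            path.append(routing_table[path[-1]][dest])
        return path

    packets = 1_000_000
    print("IP:", packets * ip_hops("nebraska", "cern"), "per-packet routing decisions")
    print("Circuit:", " -> ".join(build_circuit("nebraska", "cern")),
          "(one path computation for the whole stream)")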

The idea of the dynamic circuit network, Summerhill says, is that these circuits can be set up on demand, so that traffic needing excellent quality of service can step out of the regular flow. Because data is sent down fiber-optic cables at different frequencies of light, he explains, data from the dynamic circuit network can coexist with IP data and wouldn't require new cable to be laid. Summerhill says that Internet2 is working on software that could eventually be built into network devices to control these different flows and to set up circuits when and where they are needed.

Among the current applications for the dynamic circuit network, Internet2 expects to facilitate the transfer of data from CERN's Large Hadron Collider to researchers at other institutions, and it has run trials in which circuits are opened between the collider and the University of Nebraska. In the future, Summerhill says, the researchers hope that commercial applications will develop from the technology. "Think of a network that provided hundreds or thousands of high-definition channels and also provided on-demand video capabilities," he says. He foresees a commercial network that needs both high bandwidth and high quality of service, much like some current academic requirements. "The methods for supporting that network are under investigation," Summerhill says. Although there are no commercial implementations right now, he notes that Internet2 works with commercial partners that might eventually be a conduit for bringing the technology into the ordinary Internet.

Clive Davenhall worked on software for academic circuit-switched networks in the United Kingdom, as part of his role as an engineer at the National e-Science Centre, in Edinburgh, which works to improve methods for conducting large-scale science research over the Internet. Davenhall says that, although people have been talking about dynamic circuit networks for a long time, this type of network hasn't had much of an impact on the commercial Internet, partly because of concerns about how it might function in an environment less controlled than academia. For example, if the average person could set up a dedicated circuit on demand, it might be possible to hog resources in ways that interfere with other users' experience.

Summerhill says that the dynamic circuit network is still in its early stages, and "still has some evolution to do." He recalls the time that IP wasn't considered ready for commercial applications. So far, four universities in four different regional networks are connected to the dynamic circuit network, says Lauren Rotman, public relations manager for Internet2. Rotman adds that it will be easy to add universities in regions that are already connected. The organization hopes to increase the dynamic circuit network's reach significantly in the coming year.

http://www.technologyreview.com/Infotech/20277/page1/

Blu-ray Disc recordable

Blu-ray Disc recordable refers to two optical disc formats that can be recorded with an optical disc recorder. BD-R discs can be written to once, whereas BD-RE can be erased and re-recorded multiple times. Disc capacities are 25GB (23.31 GiB) for single layer discs and 50GB (46.61 GiB) for dual layer discs.[1]

As of January 2008, BD-R/RE drives supporting write speeds up to 6x sell for about US$400, and 2x single-layer BD-R discs, with a capacity of 25 GB, can be found for as little as US$10.[2] The theoretical maximum speed for Blu-ray Discs is about 12x, as the required rotation speed (10,000 rpm) would otherwise cause too much wobble for the discs to be read properly. This is comparable to the respective 20x and 52x maximum speeds of DVDs and CDs.

Versions

There are three versions of rewritable Blu-ray Discs (BD-RE):[3]

Version 1.0

  • unique BD File System
  • not computer compatible

Version 2.0

  • UDF 2.5 file system for computer use
  • the use of AACS
  • BD-R Version 1.0 follows this specification

Version 3.0

  • camcorder (8 cm) discs added
  • backward compatible with Version 2.0
  • BD-R Version 2.0 follows this specification

Speed

Drive speed    Data rate               Single-layer BD write time
1x [1]         36 Mbit/s (4.5 MB/s)    95 min
2x             72 Mbit/s (9 MB/s)      47 min
4x             144 Mbit/s (18 MB/s)    24 min
8x             288 Mbit/s (36 MB/s)    12 min
12x            432 Mbit/s (54 MB/s)    8 min
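
As a sanity check, the write times in the table follow directly from the 1x rate and the single-layer capacity. A short Python sketch (raw payload time only):

    # Recomputing single-layer write times from the 1x rate.
    CAPACITY_BITS = 25_000_000_000 * 8   # 25 GB single layer
    BASE_RATE = 36_000_000               # 1x = 36 Mbit/s

    for speed in (1, 2, 4, 8, 12):
        minutes = CAPACITY_BITS / (BASE_RATE * speed) / 60
        print(f"{speed:2d}x: {BASE_RATE * speed / 8e6:5.1f} MB/s, ~{minutes:4.1f} min")
    # 1x comes out to ~93 minutes of raw writing; the table's 95 minutes
    # presumably includes lead-in and verification overhead.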


A blank rewritable Blu-ray disc (BD-RE)

http://en.wikipedia.org/wiki/Blu-ray_Disc_recordable

Federal Research Funding Cut

Financial support for a major international fusion project is one of many casualties.

Cooled fusion: The United States has stopped funding research for an international fusion-reactor project called ITER.
Credit: ITER and Technology Review

It was supposed to be a year of sharp increases in federal funding for physical-sciences research. Instead, as a result of the final appropriations bill signed into law a few weeks ago, fiscal year 2008 (the federal fiscal year runs from October 1 to September 30) brought cuts that will cost hundreds of researchers their jobs and put the future of two important international projects in jeopardy, including an effort to build a large-scale fusion demonstration facility.

For most of 2007, as Congress and the Bush administration debated the federal budget, support was strong from both parties for significantly increasing funding for three federal agencies that support the lion's share of basic research in the physical sciences: the National Science Foundation, the National Institute of Standards and Technology, and the Department of Energy's (DOE) Office of Science. Indeed, the president's proposed budget included increased funding for these agencies, as part of a plan to double investment in physical-sciences research over the next 10 years. And early appropriations bills met these targets. But veto threats and one actual veto related to a cap on domestic spending imposed by President Bush kept these bills from becoming law.

Instead, a catch-all appropriations bill was passed in late December, with last-minute cuts that eliminated not only the proposed increases to these agencies, but also funding for some programs within these agencies. The cuts caught researchers by surprise just before the holidays and sent directors of at least two national labs scrambling to find ways to deal with the unexpected shortfalls. As a result of the cuts, hundreds of researchers at Fermilab, in Batavia, IL, and at the Stanford Linear Accelerator Center (SLAC), in Menlo Park, CA, will be laid off.

What's more, two international projects will receive no funding at all for the remainder of the fiscal year. One endeavor, the International Linear Collider project, is being designed to answer some fundamental questions about the universe, such as those concerning the nature of dark matter. While funding could be restored in the future, layoffs will mean that the labs involved could lose key technical staff, says Persis Drell, the director of SLAC. She says that a particle collider at the lab will also have to shut down due to lack of funds, which will mean that the lab must back out of some international commitments.

"It pains me greatly that at a time when particle physics needs to be ever more international, the political process in the U.S. has resulted in real damage to the relationships with our international partners," Drell said in a speech to the researchers and staff at her lab.

Another important project, a proposed demonstration of nuclear fusion--called ITER--was slated to receive $160 million in federal funding this year; instead, it received nothing. ITER will consist of a 500-megawatt fusion reactor, to be built in the south of France, with which researchers will attempt to demonstrate that fusion can be a practical source of electricity. If all goes well, results of the project will be used to design the first commercial fusion power plants. Fusion projects in general have been delayed in part by intermittent funding, says Ian Hutchinson, the head of the Department of Nuclear Science and Engineering at MIT. The ITER project is taking up where research left off in the early 1990s, the last time funding dropped off. If funding had been constant, Hutchinson says, "we could have been at this stage 10 years ago." He calls the current cuts a "complete disaster" in terms of the message they convey to the international community. "It's completely reversing ourselves from what we've been saying the last four years," he says, given that United States officials have publicly supported the project.

The ITER project could go on without support from the United States, but it will move forward more slowly, Hutchinson says, and when the facility is complete, researchers in this country won't have timely access to the results. He hopes that in the coming year, "cooler heads will prevail" and the funding for ITER will be restored.

The appropriations bill is not bad news across the board for research and development, but it does favor short-term development, which often comes at the expense of long-term research. For example, the DOE overall received an increase in funding compared with both last year and the president's request. But the Office of Science--the basic research arm of the agency--saw nearly a half-billion-dollar cut compared with appropriations bills in Congress earlier in the year. In the DOE, some programs that were slated to be cut in the president's budget will continue to receive funding, such as research on geothermal and hydroelectric energy. Eliminating these proposed cuts added to the overall budget and led to cuts elsewhere.

The cuts in research funding have researchers and organizations such as the American Physical Society calling for Congress to push through new funding this year. But many, including Drell, are preparing for more difficult times ahead: they're anticipating similar budget shortcomings next year.

http://www.technologyreview.com/Biztech/20085/?a=f

Controlling Cell Behavior with Magnets

Nanoparticles allow researchers to initiate biochemical events at will.

Cell switch: Immune cells coated with nanoparticles take up calcium in the presence of a magnetic field. Each nanoparticle measures approximately 30 nanometers in diameter. In this image, yellow cells are taking up calcium in response to a localized magnetic field. Cells that are farther away from the field are shown in purple and do not take up calcium.
Credit: Donald Ingber, Harvard Medical School

For the first time, researchers have demonstrated a means of controlling cell functions with a physical, rather than chemical, signal. Using a magnetic field to pull together tiny beads targeted to particular cell receptors, Harvard researchers made cells take up calcium, and then stop, then take it up again. Their work is the first to prove that such a level of control over cells is possible. If the approach can be used with many cell types and cell functions, it could lead to a totally new class of therapies that rely on cells themselves to make and release drugs.

The research, which appeared in the journal Nature Nanotechnology, was led by Donald Ingber, professor of pathology at Harvard Medical School and cochair of the Harvard Institute for Biologically Inspired Engineering. Ingber's group demonstrated its method for biomagnetic control using a type of immune-system cell that mediates allergic reactions. Targeted nanoparticles with iron oxide cores were used to mimic antigens in vitro. Each is attached to a molecule that in turn can attach to a single receptor on an immune cell. When Ingber exposes cells bound with these particles to a weak magnetic field, the nanoparticles become magnetic and draw together, pulling the attached cell receptors into clusters. This causes the cells to take in calcium. (In the body, this would initiate a chain of events that leads the cells to release histamine.) When the magnetic field is turned off, the particles are no longer attracted to each other, the receptors move apart, and the influx of calcium stops.

"It's not the chemistry; it's the proximity" that activates such receptors, says Ingber.

The approach could have a far-reaching impact, as many important cell receptors are activated in a similar way and might be controlled using Ingber's method.

"In recent years, there has been a realization that physical events, not just chemical events, are important" to cell function, says Shu Chien, a bioengineer at the University of California, San Diego. Researchers have probed the effects of physical forces on cells by, for example, squishing them between plates or pulling probes across their surfaces. But none of these techniques work at as fine a level of control as Ingber's magnetic beads, which act on single biomolecules.

"Up to now, there hasn't been much control [over cells] at this scale," says Larry Nagahara, project manager in the National Cancer Institute's Alliance for Nanotechnology in Cancer and a physics professor at Arizona State University.


Many drugs, from anticancer antibodies to hormones, work by activating cell receptors. Once a hormone is in the blood, however, there's no turning it on or off. "This shows that you can turn on and off the signal, and that you can do it instantly," says Christopher Chen, a bioengineer at the University of Pennsylvania. "That's something that's hard to do, for example, with an antibody."

Ingber has many ideas for devices that might integrate his method of cellular control. Magnetic pacemakers could use cells instead of electrodes to send electrical pulses to the heart. Implantable drug factories might contain many groups of cells, each of which makes a different drug when activated by a magnetic signal. Biomagnetic control might lead to computers that can take advantage of cells' processing power. "Cells do complex things like image processing so much better than computers," says Ingber. Ingber, who began the project in response to a call by the Defense Advanced Research Projects Agency for new cell-machine interfaces, acknowledges that his work is in its early stages. In fifty years, however, he expects that there will be devices that "seamlessly interface between living cells and machines."

Other researchers agree. Ingber's biomagnetic control "may represent a new mechanism for man-machine interfaces," says UC San Diego's Chien. But before such interfaces can be developed, says University of Pennsylvania engineer Chen, researchers need to learn a lot more about cells.

"Say we have cells on a chip and we know what behavior we want to elicit," such as getting a stem cell to enter a wound site and initiate repairs, says Chen. "We don't know what signaling events have to happen to put the cell into the right state" so that it will take the desired action.

In the short term, Chen says that Ingber's method could help biologists gain crucial knowledge about cell signaling, such as how these signals are processed chemically and physically by the cell, and how they lead to particular outcomes, from calcium uptake to changes in gene expression. "It provides a tool that lets us tweak the cell and see what happens," says Chen.

http://www.technologyreview.com/Nanotech/20087/page2/

How Important Is the Latest Cloning Feat?

Scientists have generated early-stage cloned human embryos, but not stem cells.

Human cloning: Shown here is a three-day-old cloned embryo, created from a donor egg and the skin cell of an adult male.
Credit: Stemagen

Scientists at Stemagen, a small biotechnology company in La Jolla, CA, reported yesterday that they have for the first time generated cloned human blastocysts--early-stage embryos--from adult skin cells. This is the first step in generating stem cell lines matched to individuals, which are crucial for creating new cellular models of disease and potentially important for future tissue replacement therapies. (See "Next Steps for Stem Cells" and "The Real Stem Cell Hope".) The new findings also confirm that access to fresh eggs from healthy young donors is a key part of successful cloning. Lack of access to human eggs has been the major barrier in the field. (See "Human Therapeutic Cloning at a Standstill".)

Cloned blastocysts have been generated before, but from embryonic stem cells rather than from adult cells. Scientists theorize that embryonic stem cells are easier to turn into blastocysts because of their earlier developmental stage.

Experts in the field have had a mixed reaction to the new work. "It's a nice achievement, but in my view, they haven't crossed the bar," says Evan Snyder, director of the Stem Cells and Regenerative Medicine Program at the Burnham Institute in La Jolla. "The real test will be, can you generate cell lines that are stable and self-renewing and normal?" Others applaud the confirmation of the feasibility of human cloning. "The fact that it can be done is important," says Jeanne Loring, a stem cell scientist at the Scripps Research Institute in La Jolla. "It wipes away that blot on our scientific integrity," she says, referring to a massive fraud unveiled in 2005 in which South Korean scientist Woo Suk Hwang claimed to have generated stem cell lines from cloned human embryos. (See "Stem Cells Reborn".)


To clone an embryo, a process also called nuclear transfer, scientists first strip an egg of its genetic material. Then they insert DNA from an adult cell, such as a skin cell, into the egg. Through an unknown process, the egg turns back the clock on the adult DNA and begins to develop as a normally fertilized egg would. From the embryo, researchers could theoretically collect a specialized ball of cells that can be coaxed to turn into stem cells. So far, however, no one has successfully performed this feat.

Stemagen, a relatively unknown player in the field, probably owes its success to access to human eggs through a close association with a local fertility clinic. (The company was founded by a fertility specialist at the Reproductive Sciences Center in La Jolla.) "We were able to get access to high-quality oocytes and have them in the incubator within one to two hours," says Andrew French, Stemagen's chief scientific officer.

Egg donors and the intended parents gave the Stemagen scientists, for research, eggs in excess of those needed for in vitro fertilization. Regulations in many states prohibit compensation for donated eggs for ethical reasons, a requirement that has slowed other cloning efforts.

Starting with 25 fresh oocytes, French and colleagues generated five blastocysts--five- to six-day-old embryos consisting of 30 to 70 cells. Rather than attempting to generate stem cell lines from the embryos, the researchers sent them to an independent company for genetic confirmation of their results. "They showed we had completely removed the DNA from the egg donor and replaced it with DNA from the skin-cell donor," says French. One blastocyst was confirmed as a clone via two DNA-fingerprinting methods, while genetic analysis of two others indicated the likelihood that they were clones.

The next crucial step will be generating stem cell lines from cloned embryos, which many stem cell scientists speculate will be the most challenging part. "That's likely where Hwang failed," says Snyder.

French and colleagues are planning such experiments, with results potentially in the next eight to twelve months. "The quality of our blastocysts improved with each experiment," says French. Based on the success rate of previous attempts to make stem cells from regular embryos, he estimates that Stemagen will be able to generate a stem cell line from between five and ten cloned embryos and report the results in the next year. The company aims to sell or license the lines to pharmaceutical companies and others who would use them to test new drugs or develop new therapies.

While human therapeutic cloning has always been an ethically contentious area of research--partly because it requires the creation and destruction of human embryos--it has recently come under greater fire. After the announcement of new techniques for reprogramming adult cells so that they turn into stem cells without first forming embryos, some opponents called for a halt on embryonic-stem-cell research. (See "Stem Cells without the Embryos".)

However, researchers in the field emphasize the need to pursue all reprogramming techniques. "Even though there are other techniques to reprogram a cell that have gotten a lot of press, we still don't know how those compare with the reprogramming you actually see with nuclear transfer," says Snyder. "My feeling is, if we understand nuclear transfer better, we will be able to do the other kind of reprogramming more efficiently."

http://www.technologyreview.com/Biotech/20088/

Sony KDL-46D3500 Review

46in LCD
Pristine HD pictures tempered by slightly disappointing SD pictures.
HD Ready: yes
Resolution: 1920 x 1080



Design

The KDL-46D3500 is the embodiment of Sony's design philosophy, with a chic, understated matte-black presence that simply oozes class. Build quality is back to its very best, the Sony looking as if it had been sculpted from a solid block of metal.

Features

Unlike the KDL-40D3500, the KDL-46D3500 has no corresponding 3000 model in the UK. If you are familiar with Sony's D3000 series, it is worth noting that the 46D3500's spec sheet reads a little differently than you might expect.

Screen: 46in 16:9
Tuner: Digital
Sound System: Nicam
Resolution: 1920 x 1080
Contrast Ratio: 1800:1 (16,000 dynamic)
Brightness: 500cd/m2
Other Features: Bravia Picture Processing Engine, Live Colour Creation, 24p True Cinema.
Sockets: 2 HDMI, 2 SCART, Component Video, Composite Video, PC input.

Sony currently offer a huge range of LCD TVs, and the number of different models can seem quite bewildering to those of you looking to buy a new LCD TV. The D3500 sits just below the slightly higher-spec V3000 series and directly above the slightly lower-spec T3500 line.

Essentially, the D3500 gains 'True Cinema' over the T3500 but comes equipped with a slightly less sophisticated version of Sony's 'Bravia' picture processing engine than the V3000.

The 46D3500 comes equipped with a Full HD (1920 x 1080) resolution, which can give a marked improvement in the display of sources such as Sky TV (1080i). The 1080 lines of the source match the resolution of the screen, negating the need for any picture scaling. If you have a device that outputs pictures in the superior 1080p format (e.g., Sony's PlayStation 3), the 3500 can accept those pictures in their full glory.

There is no 'Motionflow +100Hz' technology on the 46D3500 (it is featured on the 40D3000); that technology doubles the number of frames shown from 50 to 100 by interpolating an extra frame between each pair of source frames.

The KDL-46D3500 is equipped with '24p True Cinema' which enables the panel to display films at their intended 24fps (frames per second).

Alongside 24p True Cinema is Sony's 'Theatre Mode' technology, which adjusts colour, contrast and brightness settings to make movies look as authentic as the original.

It is worth mentioning that the 24p mode comes into its own with High Definition (Blu-ray or HD DVD) players, which allow you to play movies at their original speed. Cine film is generally recorded at 24 frames per second, which in the absence of '24p True Cinema' is sped up to 25 frames per second (the standard for most TVs), with an accompanying increase in audio pitch.
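
The arithmetic behind that speed-up is straightforward; a quick illustrative Python calculation:

    import math

    # Playing 24 fps film at 25 fps shortens the runtime and raises the pitch.
    speedup = 25 / 24
    print(f"Speed increase: {100 * (speedup - 1):.2f}%")             # ~4.17%
    print(f"A 120-minute film runs in {120 / speedup:.1f} minutes")  # ~115.2
    print(f"Audio pitch rises by {12 * math.log2(speedup):.2f} semitones")  # ~0.71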

Colour reproduction on the KDL-46D3500 should offer smoother transitions than previous Sony LCDs, with a new 10-bit panel offering 1024 shades of gradation.

Theatre Sync, which is Sony's name for CEC (Consumer Electronics Control), is a control standard that functions over HDMI 1.3. The technology facilitates one-touch control over compatible devices; in practice, this means that if you fire up your compatible DVD player, all connected devices, such as your LCD TV, will also spring to life.

Sonically, the KDL-46D3500 comes equipped with Sony's S-Force Front Surround which is their latest virtual surround sound technology.

Performance

High Definition (HD) is where the Sony KDL-46D3500 excels. Hook up a 1080p-capable source and you will be treated to absolutely pristine pictures. The KDL-46D3500 displays a clarity and sharpness that make you want to reach out and touch objects or people as they glide across the screen. Colours are wonderfully vibrant and reach a level of authentic realism to match any LCD.

Although black levels are still behind the best that plasma can offer, the KDL-46D3500 has made great strides in this area over previous Sony models. Shadow detailing now takes on a subtlety that is a match for any 46in LCD currently out there.

Standard Definition (SD) performance suffers to a degree from the inconsistencies that creep into a picture when a 576p source is converted for an HD-ready screen, especially one this large. SD performance is nevertheless very good, and quite an accomplishment for a 46in LCD.

The SD performance can be a little 'grainy' at times, with some noticeable degradation in picture quality during faster motion sequences. Simply as a result of the extra size, the KDL-46D3500 can't quite live up to the performance of its smaller 40D3500 brother, but it performed better than we expected.

Finally, if there is a 46in LCD TV out there with a richer or more precise colour palette, we have yet to see it. The range, depth and subtlety in this respect are simply outstanding. The most intricate detailing, such as skin tone, is realised with class-leading performance.

Conclusion

The Sony KDL-46D3500 is another highly accomplished performer when it comes to High Definition material. However, if SD viewing is just as important to you, there are better performers out there.

http://www.hdtvorg.co.uk/reviews/lcd/sony_kdl-46d3500.htm

Microsoft Tests Memory-Making Camera

The SenseCam is no tourist's point-and-shoot, but may help give Alzheimer's patients a photographic assist.


A digital camera developed by Microsoft is undergoing testing, but you won't see it in any stores soon.

Over the past several years at its research facility in Cambridge, England, the company created a wearable digital camera called the SenseCam. The camera is designed to take a low-resolution photo every 30 seconds while it dangles from its wearer's neck.

The SenseCam has received increasing attention in the medical field as an experimental tool to help those with memory problems, such as Alzheimer's disease. In 2005 the first trials began, and over time, the SenseCam has been used to help those with more severe memory problems, said Emma Berry, a neuropsychologist at Addenbrooke's Hospital in Cambridge.

Patient Tests SenseCam

Berry has been working recently with a 68-year-old Cambridge woman, "Mrs. F," who was diagnosed 12 years ago with severe memory impairment. For example, if Mrs. F goes to an art exhibit in the morning, she will not remember the activity the next day, Berry said.

Mrs. F wears the SenseCam on a lanyard around her neck when she and Berry do an activity. The SenseCam takes hundreds of images with its fish-eye lens, which provides a wide-angle view. Then, every two days for two weeks, Mrs. F reviews the images.

"At the end of the two weeks, she has a fantastic recollection of the event," Berry said. "What seems to happen is that when she looks at the images, some images don't bring to mind the events at all, but one or two of the images or maybe 10 of the images will bring it all back to her."

A key factor seems to be the quantity of images, since different images and scenes are more significant for some people than others, Berry said. For one person, the color of another person's shoes captured in an image may be enough to trigger wider recollections, she said.

The SenseCam can take plenty of images. It has a 1GB SD memory card and can shoot as many as 30,000 images at 640-by-480-pixel Video Graphics Array quality. That spec isn't very impressive compared with today's digital cameras, but it's enough to jog memory, said Steve Hodges, who manages the SenseCam project at Microsoft Research in Cambridge.
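
Those figures imply roughly how long the camera can run between downloads. A rough Python calculation, assuming the base 30-second interval and ignoring sensor-triggered extra shots:

    # How long does a 30,000-image card last at one photo per 30 seconds?
    shots_per_day = 24 * 60 * 60 // 30    # 2,880 images per day
    card_capacity = 30_000                # images per 1GB card, per the article
    print(f"{shots_per_day:,} images/day -> ~{card_capacity / shots_per_day:.0f} days per card")
    print(f"Average image size: ~{1_000_000 // card_capacity} KB per VGA frame")  # ~33 KB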

"It's remarkable how it appears to trigger your memory for that event," Hodges said. "It seems to bring you back to that original moment."

Tailored Features

SenseCam holds advantages over video recorders, Hodges said. The device is less intrusive for the user to wear, and the snapshots can be viewed at a faster pace later, allowing a person to get to the significant images rather than watching a video clip in real time. SenseCam's battery will last more than a day, and its user must download the images every couple of days.

SenseCam's image-viewing software is easy enough for elderly people to manage and designed to display images in a flip-book fashion, Hodges said. Similar to other photo-viewing software, a person can choose how quickly they want to play back the photos, he said.

The device has other features tailored to its purpose. It will interrupt its 30-second interval to take a photo when it senses a sudden change in lighting or heat, and it's equipped with a passive infrared sensor that can trigger a photo when it detects another person nearby.

So far, Microsoft isn't working on advancing the hardware specifications and instead is concentrating on engaging the medical community, Hodges said. Microsoft has no plans to commercialize SenseCam, but it has provided US$550,000 in funding for medical research projects using it.

Researchers are still a long way from understanding how memory works. Duke University in Durham, North Carolina, and the University of Leeds in England have a research project underway using the SenseCam to study autobiographical memory, or how people remember events over their lifetime.

"The jury is out over what part of our brains are involved in autobiographical memory," Berry said.

http://www.pcworld.com/article/id,141568-c,researchreports/article.html#

Human chromosomes

Human cells have 23 pairs of large linear nuclear chromosomes, giving a total of 46 per cell. In addition to these, human cells have many hundreds of copies of the mitochondrial genome. Sequencing of the human genome has provided a great deal of information about each of the chromosomes. Below is a table compiling statistics for the chromosomes, based on the Sanger Institute's human genome information in the Vertebrate Genome Annotation (VEGA) database.[42] Number of genes is an estimate as it is in part based on gene predictions. Total chromosome length is an estimate as well, based on the estimated size of unsequenced heterochromatin regions.

Chromosome  Genes  Total bases  Sequenced bases[43]
1 3,148 247,200,000 224,999,719
2 902 242,750,000 237,712,649
3 1,436 199,450,000 194,704,827
4 453 191,260,000 187,297,063
5 609 180,840,000 177,702,766
6 1,585 170,900,000 167,273,992
7 1,824 158,820,000 154,952,424
8 781 146,270,000 142,612,826
9 1,229 140,440,000 120,312,298
10 1,312 135,370,000 131,624,737
11 405 134,450,000 131,130,853
12 1,330 132,290,000 130,303,534
13 623 114,130,000 95,559,980
14 886 106,360,000 88,290,585
15 676 100,340,000 81,341,915
16 898 88,820,000 78,884,754
17 1,367 78,650,000 77,800,220
18 365 76,120,000 74,656,155
19 1,553 63,810,000 55,785,651
20 816 62,440,000 59,505,254
21 446 46,940,000 34,171,998
22 595 49,530,000 34,893,953
X (sex chromosome) 1,093 154,910,000 151,058,754
Y (sex chromosome) 125 57,740,000 22,429,293
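
One simple use of the table is checking how completely each chromosome has been sequenced. A short Python sketch, using three rows copied from above:

    # Sequenced fraction per chromosome: (total bases, sequenced bases).
    rows = {
        "1": (247_200_000, 224_999_719),
        "9": (140_440_000, 120_312_298),
        "Y": (57_740_000, 22_429_293),
    }
    for chrom, (total, sequenced) in rows.items():
        print(f"chr{chrom}: {100 * sequenced / total:.1f}% sequenced")
    # chr1 ~91.0%, chr9 ~85.7%, chrY ~38.8% -- the shortfalls are the
    # unsequenced heterochromatin regions mentioned in the text.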

http://en.wikipedia.org/wiki/Chromosome

Chromosomal aberrations

The three major single-chromosome mutations: deletion (1), duplication (2) and inversion (3).
The two major two-chromosome mutations: insertion (1) and translocation (2).
In Down syndrome, there are three copies of chromosome 21.

Chromosomal aberrations are disruptions in the normal chromosomal content of a cell and are a major cause of genetic conditions in humans, such as Down syndrome. Some chromosome abnormalities, such as translocations or chromosomal inversions, do not cause disease in carriers, although they may lead to a higher chance of having a child with a chromosome disorder. Abnormal numbers of chromosomes or chromosome sets, called aneuploidy, may be lethal or may give rise to genetic disorders. Genetic counseling is offered to families that may carry a chromosome rearrangement.

The gain or loss of chromosome material can lead to a variety of genetic disorders. Human examples include:

  • Cri du chat, which is caused by the deletion of part of the short arm of chromosome 5. "Cri du chat" means "cry of the cat" in French, and the condition was so-named because affected babies make high-pitched cries that sound like a cat. Affected individuals have wide-set eyes, a small head and jaw and are moderately to severely mentally retarded and very short.
  • Wolf-Hirschhorn syndrome, which is caused by partial deletion of the short arm of chromosome 4. It is characterized by severe growth retardation and severe to profound mental retardation.
  • Down's syndrome, usually is caused by an extra copy of chromosome 21 (trisomy 21). Characteristics include decreased muscle tone, asymmetrical skull, slanting eyes and mild to moderate mental retardation.
  • Edwards syndrome, which is the second most common trisomy after Down syndrome. It is a trisomy of chromosome 18. Symptoms include mental and motor retardation and numerous congenital anomalies causing serious health problems. Ninety percent die in infancy; however, those who live past their first birthday usually are quite healthy thereafter. They have a characteristic hand appearance with clenched hands and overlapping fingers.
  • Patau Syndrome, also called D-Syndrome or trisomy-13. Symptoms are somewhat similar to those of trisomy-18, but they do not have the characteristic hand shape.
  • Idic15, an abbreviation for isodicentric 15 on chromosome 15. The condition is also known by several names in the research literature, all meaning the same thing: IDIC(15), inverted duplication 15, extra marker, inv dup 15, and partial tetrasomy 15.
  • Jacobsen syndrome, also called the terminal 11q deletion disorder.[1] This is a very rare disorder. Those affected have normal intelligence or mild mental retardation, with poor expressive language skills. Most have a bleeding disorder called Paris-Trousseau syndrome.
  • Klinefelter's syndrome (XXY). Men with Klinefelter syndrome are usually sterile, and tend to have longer arms and legs and to be taller than their peers. Boys with the syndrome are often shy and quiet, and have a higher incidence of speech delay and dyslexia. During puberty, without testosterone treatment, some of them may develop gynecomastia.
  • Turner syndrome (X instead of XX or XY). In Turner syndrome, female sexual characteristics are present but underdeveloped. People with Turner syndrome often have a short stature, low hairline, abnormal eye features and bone development and a "caved-in" appearance to the chest.
  • XYY syndrome. XYY boys are usually taller than their siblings. Like XXY boys and XXX girls, they are somewhat more likely to have learning difficulties.
  • Triple-X syndrome (XXX). XXX girls tend to be tall and thin. They have a higher incidence of dyslexia.
  • Small supernumerary marker chromosome. This means there is an extra, abnormal chromosome. Features depend on the origin of the extra genetic material. Cat-eye syndrome and isodicentric chromosome 15 syndrome (or Idic15) are both caused by a supernumerary marker chromosome, as is Pallister-Killian syndrome.

Chromosomal mutations produce changes in whole chromosomes (more than one gene) or in the number of chromosomes present.

  • Deletion - loss of part of a chromosome
  • Duplication - extra copies of a part of a chromosome
  • Inversion - reverse the direction of a part of a chromosome
  • Translocation - part of a chromosome breaks off and attaches to another chromosome

Most mutations are neutral, having little or no effect.

A detailed graphical display of all human chromosomes and the diseases annotated at the correct spot may be found at [2].

http://en.wikipedia.org/wiki/Chromosome


Main article: Karyotype
Figure 3: Karyogram of a human male

In general, the karyotype is the characteristic chromosome complement of a eukaryote species.[36] The preparation and study of karyotypes is part of cytogenetics.

Although the replication and transcription of DNA is highly standardized in eukaryotes, the same cannot be said for their karyotypes, which are often highly variable. There may be variation between species in chromosome number and in detailed organization, and in some cases there is significant variation within species. Often there is variation:

1. between the two sexes;
2. between the germ line and the soma (between gametes and the rest of the body);
3. between members of a population, due to balanced genetic polymorphism;
4. between geographical races;
5. in mosaics or otherwise abnormal individuals.

Finally, variation in karyotype may occur during development from the fertilised egg.

The technique of determining the karyotype is usually called karyotyping. Cells can be locked part-way through division (in metaphase) in vitro (in a reaction vial) with colchicine. These cells are then stained, photographed and arranged into a karyogram: the set of chromosomes laid out with autosomes in order of length and sex chromosomes (here XY) at the end, as in Fig. 3.

Like many sexually reproducing species, humans have special gonosomes (sex chromosomes, in contrast to autosomes). These are XX in females and XY in males.

Historical note

Investigation into the human karyotype took many years to settle the most basic question: how many chromosomes does a normal diploid human cell contain? In 1912, Hans von Winiwarter reported 47 chromosomes in spermatogonia and 48 in oogonia, concluding that sex was determined by an XX/XO mechanism.[37] Painter in 1922 was not certain whether the diploid number of humans was 46 or 48, at first favouring 46.[38] He later revised his opinion from 46 to 48, and he correctly insisted on humans having an XX/XY system.[39] Considering the techniques of the time, these results were quite remarkable.

New techniques were needed to definitively solve the problem:

1. Using cells in culture
2. Pretreating cells in a hypotonic solution, which swells them and spreads the chromosomes
3. Arresting mitosis in metaphase by a solution of colchicine
4. Squashing the preparation on the slide forcing the chromosomes into a single plane
5. Cutting up a photomicrograph and arranging the result into an indisputable karyogram.

It was not until the mid-1950s that it became generally accepted that the human karyotype includes only 46 chromosomes.[40][41] Interestingly, chimpanzees, our closest living relatives, have 48 chromosomes.

http://en.wikipedia.org/wiki/Chromosome
