Sunday, October 14, 2012

SpaceX cargo ship reaches International Space Station

The SpaceX Falcon 9 rocket lifts off from Space Launch Complex 40 at the Cape Canaveral Air Force Station in Cape Canaveral, Florida October 7, 2012. REUTERS/Michael Brown
Astronauts plucked a commercial cargo ship from orbit on Wednesday and attached it to the International Space Station, marking the reopening of a U.S. supply line to the orbital outpost following the space shuttles' retirement last year.

After a 2-1/2 day trip, Space Exploration Technologies' Dragon cargo ship positioned itself 33 feet away from the $100 billion research complex, a project of 15 countries, which has been dependent on Russian, European and Japanese freighters for supplies.

Astronaut Akihiko Hoshide then used the space station's 58-foot-long (17.7-meter) robotic arm to grab hold of a grapple fixture on the side of the capsule at 6:56 a.m. EDT (1056 GMT) as the spacecraft flew 250 miles above the Pacific Ocean, off the coast of Baja California in northwest Mexico.

"Looks like we tamed the Dragon," commander Sunita Williams radioed to Mission Control in Houston.

"We're happy she's on board with us. Thanks to everybody at SpaceX and NASA for bringing her here to us. And the ice cream," she said.

The Dragon's cargo includes a freezer to ferry science samples back and forth between the station and Earth. For the flight up, it was packed with chocolate-vanilla swirl ice cream, a rare treat for an orbiting crew.

Williams and Hoshide attached the capsule to a docking port on the station's Harmony connecting module at 9:03 a.m. EDT (1303 GMT).

It is expected to remain docked to the station for about 18 days while the crew unloads its 882 pounds (400 kg) of cargo and fills it with science experiments and equipment no longer needed on the outpost.

The flight is the first of 12 planned under a $1.6 billion contract NASA placed with privately owned Space Exploration Technologies, or SpaceX, to deliver cargo to the station.

The U.S. space agency's second supplier, Orbital Sciences Corp, plans to debut its Antares rocket later this year. A demonstration run to the station is planned for February or March.

NASA also is working with SpaceX, Boeing Co and privately owned Sierra Nevada Corp to design space taxis that can fly crew to and from the station, with the goal of breaking Russia's monopoly on those flights by 2017.


Saturday, October 13, 2012

Redefining Medicine With Apps and iPads - The Digital Doctor

As a third-year resident in internal medicine, Dr. Rajkomar was the senior member of the team, and the others looked to him for guidance. An infusion of saline was the answer, but the tricky part lay in the details. Concentration? Volume? Improper treatment could lead to brain swelling, seizures or even death. Dr. Rajkomar had been on call for 24 hours and was exhausted, but the clinical uncertainty was “like a shot of adrenaline,” he said. He reached into a deep pocket of his white coat and produced not a well-thumbed handbook but his iPhone. With a tap on an app called MedCalc, he had enough answers within a minute to start the saline at precisely the right rate.

The history of medicine is defined by advances born of bioscience. But never before has it been driven to this degree by digital technology. The proliferation of gadgets, apps and Web-based information has given clinicians — especially young ones like Dr. Rajkomar, who is 28 — a black bag of new tools: new ways to diagnose symptoms and treat patients, to obtain and share information, to think about what it means to be both a doctor and a patient.

And it has created something of a generational divide. Older doctors admire, even envy, their young colleagues’ ease with new technology. But they worry that the human connections that lie at the core of medical practice are at risk of being lost.

“Just adding an app won’t necessarily make people better doctors or more caring clinicians,” said Dr. Paul C. Tang, chief innovation and technology officer at Palo Alto Medical Foundation in Palo Alto, Calif. “What we need to learn is how to use technology to be better, more humane professionals.”

Dr. Paul A. Heineken, 66, a primary care physician, is a revered figure at the San Francisco V.A. Medical Center.
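The article does not say which formula the app applied; one commonly taught estimate for dosing saline in low-sodium (hyponatremia) emergencies, of the kind calculator apps like MedCalc offer, is the Adrogué-Madias equation. A minimal sketch in Python, with invented patient values:

```python
def delta_na_per_litre(serum_na, infusate_na, weight_kg, water_fraction=0.6):
    """Adrogue-Madias estimate: expected rise in serum sodium (mEq/L)
    from infusing one litre of the given fluid."""
    total_body_water = weight_kg * water_fraction  # ~0.6 for adult men
    return (infusate_na - serum_na) / (total_body_water + 1)

# Invented example: 70 kg patient, serum sodium 110 mEq/L, treated with
# 3% hypertonic saline (sodium content 513 mEq/L).
rise_per_litre = delta_na_per_litre(110, 513, 70)   # about 9.4 mEq/L per litre

# A cautious correction target (~0.5 mEq/L per hour) then fixes the drip rate.
litres_per_hour = 0.5 / rise_per_litre              # roughly 0.05 L/hr (~53 mL/hr)
```

The stakes in the anecdote come straight out of the arithmetic: overshoot the correction rate and the brain swelling and seizures the passage mentions become real risks.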
He is part of a generation that shared longstanding assumptions about the way medicine is practiced: Physicians are the unambiguous source of medical knowledge; notes and orders are written in paper records while standing at the nurses’ station; and X-rays are film placed on light boxes and viewed over a radiologist’s shoulder.

One recent morning, while leading trainees through the hospital’s wards, Dr. Heineken faced the delicate task of every teacher of medicine — using the gravely ill to impart knowledge. The team arrived at the room of a 90-year-old World War II veteran who was dying — a ghost of a man, his face etched with pain, the veins in his neck protruding from the pressure of his failing heart. Dr. Heineken apologized for the intrusion, and the patient forced a smile.

The doctor knelt at the bedside to perform the time-honored tradition of percussing the heart. “Do it like this,” he said, placing his left hand over the man’s heart and tapping its middle finger with the middle finger of his right. One by one, each trainee took a turn. An X-ray or echocardiogram would do the job more accurately. But Dr. Heineken wanted the students to experience discovering an enlarged heart in a physical exam.

Dr. Heineken fills his teaching days with similar lessons, which can mean struggling upstream against a current of technology. Through his career, he has seen the advent of CT scans, ultrasounds, M.R.I.’s and countless new lab tests. He has watched peers turn their backs on patients while struggling with a new computer system, or rush patients through their appointments while forgetting the most fundamental tools — their eyes and ears.

For these reasons, he makes a point of requiring something old-fashioned of his trainees. “I tell them that their first reflex should be to look at the patient, not the computer,” Dr. Heineken said. And he tells the team to return to each patient’s bedside at day’s end.
“I say, ‘Don’t go to a computer; go back to the room, sit down and listen to them. And don’t look like you’re in a hurry.’ ” One reason for this, Dr. Heineken said, is to adjust treatment recommendations based on the patient’s own priorities. “Any difficult clinical decision is made easier after discussing it with the patient,” he said.

It is not that he opposes digital technology; Dr. Heineken has been using the Department of Veterans Affairs’ computerized patient record system since it was introduced 15 years ago. Still, his cellphone is an old flip model, and his experience with text messaging is limited.

His first appointment one recent day was with Eric Conrad, a 65-year-old Vietnam veteran with severe emphysema. First came a conversation. Dr. Heineken had his patient sit on a chair next to his desk. Despondent, the patient looked down at his battered Reeboks, his breaths shallow and labored. Dr. Heineken has been seeing Mr. Conrad since 1993, and since then, he said, “we’ve been fighting a saw-tooth battle with his weight.”

Citing privacy concerns, U.S. panel urges end to secret DNA testing

A DNA double helix is seen in an undated artist's illustration released by the National Human Genome Research Institute to Reuters on May 15, 2012. REUTERS/National Human Genome Research Institute/Handout
They're called discreet DNA samples, and the Elk Grove, California, genetic-testing company easyDNA says it can handle many kinds, from toothpicks to tampons.

Blood stains from bandages and tampons? Ship them in a paper envelope for paternity, ancestry or health testing. EasyDNA also welcomes cigarette butts (two to four), dental floss ("do not touch the floss with your fingers"), razor clippings, gum, toothpicks, licked stamps and used tissues if the more standard cheek swab or tube of saliva isn't obtainable.

If the availability of such services seems like an invitation to mischief or worse - imagine a discarded tissue from a prospective employee being tested to determine whether she's at risk for an expensive disease, for instance - the Presidential Commission for the Study of Bioethical Issues agrees.

On Thursday it released a report on privacy concerns triggered by the advent of whole genome sequencing, determining someone's complete DNA make-up. Although sequencing "holds enormous promise for human health and medicine," commission chairwoman Amy Gutmann told reporters on Wednesday, there is a "potential for misuse of this very personal data."

"In many states someone can pick up your discarded coffee cup and send it for (DNA) testing," said Gutmann, who is the president of the University of Pennsylvania.

"It's not a fantasy to think about how, without baseline privacy protection, people could use this in a way that would be really detrimental," she said, such as by denying long-term care insurance to someone with a gene that raises their risk of Alzheimer's disease, or by raising life insurance premiums for someone with an elevated genetic risk of a deadly cancer that strikes in middle age.

"Those who are willing to share some of the most intimate information about themselves for the sake of medical progress should be assured appropriate confidentiality, for example, about any discovered genetic variations that link to increased likelihood of certain diseases, such as Alzheimer's, diabetes, heart disease and schizophrenia," Gutmann said.

The commission took on the issue because whole genome sequencing is poised to become part of mainstream medical care, especially by personalizing medical treatments based on a patient's DNA.

$1,000 GENOME

That has been driven in large part by dramatic cost reductions, from $2.5 billion per genome in the Human Genome Project of the 1990s and early 2000s to $1,000 soon. Several companies, including Illumina Inc. and Life Technologies' Ion Torrent division, sell machines that can sequence a genome for a few hundred dollars, but that does not include the analysis to figure out what the string of 3 billion DNA "letters" means.

A federal law enacted in 2008 prohibits discrimination in employment or health insurance based on someone's genetic information but does not address other potential misuses of the data. Without such privacy protection, said Gutmann, people may be reluctant to participate in genetic studies that do whole genome sequencing, for fear their genetic data will not be secure and could be used against them.

Recommendations from such panels are not binding but have been used as the basis for policy and legislation.

One scenario the panel offers is a "contentious spouse" secretly having a DNA sample sequenced and using it in a custody battle "as evidence of unfitness to parent," perhaps because the DNA showed a genetic risk for mental illness or alcoholism. There are no federal laws against that.

Or, the panel said, DNA information might be posted in a social networking site "by a malicious stranger or acquaintance," possibly hurting someone's "chance of finding a spouse, achieving standing in a community, or pursuing a desired career path."

The bioethics panel recommends a dozen forms of privacy protection, including that "surreptitious commercial testing" be banned: No gene sequencing or other genetic testing should be permitted without consent from the person the DNA came from, it said. About 25 states currently allow such DNA testing.

Critics of the current lack of genetic privacy said the report should have conveyed greater urgency.

"The report lays out a lot of important best practices and does endorse further state and federal regulations, but it doesn't offer a timeline," said Jeremy Gruber, president of the Council for Responsible Genetics, a private group that monitors genetic issues. "What will inevitably happen is whole genome sequencing will enter greater use and we won't have proper regulations to ensure privacy."

A bill introduced in California, home to many DNA testing companies, by state Senator Alex Padilla would ban surreptitious testing, requiring written authorization from the person the genetic sample was taken from.

It is not clear how many labs are willing to analyze DNA without that authorization. In practice, well-known genetic testing companies such as privately held 23andMe test only tubes of saliva, a sample too large to be collected surreptitiously from, say, a drinking glass or a licked stamp. "A person would really know that they are spitting into a tube," said 23andMe spokeswoman Jane Rubinstein.


Friday, October 12, 2012

Nobel for quantum "parlor trick" that could make superfast computers possible

U.S. physicist David Wineland talks about his experiment in his lab during a media tour after a news conference in Boulder, Colorado, after learning he and Serge Haroche of France were awarded the 2012 Nobel Prize in Physics, October 9, 2012. REUTERS/Mark Leffingwell

A French and an American scientist won the Nobel Prize in physics on Tuesday for finding ways to measure quantum particles without destroying them, which could make it possible to build a new kind of computer far more powerful than any seen before.


Serge Haroche of France and American David Wineland, both 68, found ways to manipulate the very smallest particles of matter and light to observe strange behavior that previously could only be imagined in equations and thought experiments.


Wineland once described his own work as a "parlor trick" that performed the seemingly magical feat of putting an object in two places at once. Other scientists praised the achievements as bringing to life the wildest dreams of science fiction.


"The Nobel laureates have opened the door to a new era of experimentation with quantum physics by demonstrating the direct observation of individual quantum particles without destroying them," said the Royal Swedish Academy of Sciences, which awarded them the 8 million crown ($1.2 million) prize.


"Perhaps the quantum computer will change our everyday lives in this century in the same radical way as the classical computer did in the last century."


Haroche said he was walking in the street with his wife when the call announcing the award came in, and he recognized the Swedish country code.


"I saw the area code 46, then I sat down," he told reporters in Sweden by telephone. "First I called my children, then I called my closest colleagues, without whom I would never have won this prize," he said. Asked how he would celebrate, he said: "I will have champagne, of course."


He told Reuters he hoped the prize would give him a platform "that will allow me to communicate ideas, not just in this field of research but for research in general, fundamental research".


Wineland was asleep at home in Boulder, Colorado, when the phone call from Stockholm arrived before dawn on Tuesday morning, he said at a press conference. (His wife answered.)


Physics is the second of this year's crop of awards; scientists from Britain and Japan shared the first prize on Monday, in medicine, for adult stem cell research. The prizes, which reward achievements in science, literature and peace, were first awarded in 1901 in accordance with the will of Swedish dynamite millionaire Alfred Nobel.


"This year's Nobel Prize recognizes some of the most incredible experimental tests of the weirder aspects of quantum mechanics," said Jim Al-Khalili, professor of physics at the University of Surrey in Britain.


"Until the last decade or two, some of these results were nothing more than ideas in science fiction or, at best, the wilder imaginations of quantum physicists. Wineland and Haroche and their teams have shown just how strange the quantum world really is and opened up the potential for new technologies undreamt of not so long ago."


INGENIOUS METHODS


Quantum physics studies the behavior of the fundamental building blocks of the universe at a scale smaller than atoms, when tiny particles act in strange ways that can only be described with advanced mathematics.


Researchers have long dreamt of building "quantum computers" that would operate using that mathematics - able to conduct far more complicated calculations and hold vastly more data than classical computers. But they could only be built if the behavior of individual particles could be observed.


"Single particles are not easily isolated from their surrounding environment, and they lose their mysterious quantum properties as soon as they interact with the outside world," the Nobel committee explained.


"Through their ingenious laboratory methods Haroche and Wineland, together with their research groups, have managed to measure and control very fragile quantum states, which were previously thought inaccessible for direct observation. The new methods allow them to examine, control and count the particles."


Both scientists work in the field of quantum optics, studying the fundamental interactions between light and matter. The Nobel committee said they used opposite approaches to the same problem: Wineland traps charged atoms, or ions, using particles of light - photons - to measure and control them, while Haroche traps photons and uses atoms to control and measure them.


In one of the strange properties of quantum mechanics, tiny particles act as if they are simultaneously in two locations, based on the likelihood that they would be found at either, known as a "superposition."


It was long thought that it would be impossible to demonstrate this in a lab. But Wineland's "parlor trick" was to hit an atom with laser light, which according to quantum theory had a 50 percent chance of moving it, and observe the atom at two different locations, 80 billionths of a meter apart.


In a normal computer, a switch must either be on or off. A quantum computer would work with switches that, like the particles in Wineland's experiment, behaved as if they were in more than one position at the same time.
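The on-off-versus-both contrast can be made concrete with a toy state-vector calculation. This is illustrative plain Python, not a model of the laureates' actual ion-trap or cavity experiments:

```python
import math

# A qubit as two complex amplitudes: state = a*|0> + b*|1>.
state = [1 + 0j, 0 + 0j]  # the "switch" starts definitely off (|0>)

def hadamard(s):
    """Apply a Hadamard gate, putting the qubit into an equal superposition."""
    a, b = s
    r = 1 / math.sqrt(2)
    return [r * (a + b), r * (a - b)]

state = hadamard(state)

# Born rule: the probability of each measurement outcome is |amplitude|^2.
probs = [abs(amp) ** 2 for amp in state]
# Both outcomes are now equally likely (probability 1/2 each): until measured,
# the qubit behaves as if it were on and off at the same time.
```

The superposition only survives while the qubit is isolated; as the Nobel committee's statement above notes, any interaction with the outside world destroys it, which is why the laureates' measurement techniques mattered.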


An example is a computer trying to work out the shortest route around town for a travelling salesman. A traditional computer might try every possible route and then choose the shortest. A quantum computer could do the calculation in one step, as if the salesman travelled each route simultaneously.
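The brute-force search a traditional computer might run can be sketched in a few lines of Python; the towns and distances here are invented for illustration:

```python
from itertools import permutations

# Symmetric distances between four towns (values are made up).
dist = {
    ("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
    ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 3,
}

def d(a, b):
    """Look up a distance in either direction."""
    return dist.get((a, b)) or dist[(b, a)]

def tour_length(order):
    """Total length of a round trip starting and ending at town A."""
    stops = ("A",) + order + ("A",)
    return sum(d(x, y) for x, y in zip(stops, stops[1:]))

# Try every ordering of the remaining towns and keep the shortest tour.
best = min(permutations(("B", "C", "D")), key=tour_length)
```

With four towns there are only 3! = 6 orderings to try, but the count grows factorially with each added town, which is why checking routes one by one, rather than in one quantum step, becomes hopeless for classical machines.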


Wineland is a dedicated experimentalist, not bothered by the bizarre philosophical implications of quantum mechanics, such as the notion that reality does not exist until an observer measures it. "You can find debate on this, but I'm not sure we're so special in the universe" as to have the power to bring reality into being, he told Reuters.


His realism extends to applications of his work. "I wouldn't recommend anybody buy stock in a quantum computing company," Wineland told reporters, but he said he was optimistic that it might be possible to build one eventually.


He plans to be part of the quest. Asked if his science career was nearing an end, he said he had no plans to retire "until they drag me out of here for being too old".



Bits Blog: From the Land of Angry Birds, a Mobile Game Maker Lifts Off

For a country with a population about the size of Minnesota's, Finland has produced some giant global hits in the mobile business, like the phone maker Nokia and Rovio, the company responsible for Angry Birds and Bad Piggies. A Finnish mobile games start-up called Supercell wants a crack at glory, too.
The Helsinki-based company calls itself a “tablet first” games company, meaning that it designs its games to take advantage of the larger screen of the tablet rather than just blowing up smartphone games to a bigger display (though it releases versions of its games for smartphones too). For now, the dominance of Apple’s iPad in the tablet market means Supercell is focused mainly on that device.

This year it introduced two games for Apple’s iOS devices — a farming game called Hay Day and a strategy game featuring wizards and barbarians called Clash of Clans. Both have done well, but Clash of Clans has been especially successful, occupying the No. 1 slot on Apple’s top-grossing iPad game chart in over five dozen countries for weeks, according to Supercell. The games are free to download and play, but, like FarmVille and a variety of other games, Supercell sells its users in-game currency so they can speed up their game progress and buy virtual goods.
Using this model, Supercell executives say its two games are currently grossing over $500,000 a day, which translates into about $350,000 a day in revenue for Supercell after Apple takes its 30 percent cut on transactions through its iOS App Store.
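The split between those two figures is just Apple's standard commission; a quick check, using the numbers quoted above:

```python
APP_STORE_CUT = 30                      # percent Apple keeps on iOS in-app sales

daily_gross = 500_000                   # dollars per day, the figure Supercell cites
daily_net = daily_gross * (100 - APP_STORE_CUT) // 100
# daily_net is 350000, matching the roughly $350,000 a day in the article
```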
Besides its country of origin, Supercell shares something else with Rovio: an investor, Accel Partners, the Silicon Valley venture capital firm that was also an early investor in Facebook. The company has raised $15 million in financing from Accel, London Venture Partners and others, $12 million of it from Accel.
In a phone interview, Ilkka Paananen, the founder and chief executive of Supercell, said he did not believe that Accel’s investment in Supercell was connected to the firm’s investment in Rovio. He said, however, that the quality of Rovio’s games had been a big influence on start-ups in the country.
“One thing they’ve really done for the Finnish gaming community is they’ve done a huge favor in raising the bar for everybody,” Mr. Paananen said.
Separately on Monday, Rovio unveiled a plan to keep its Angry Birds franchise steaming forward, with a new game called Angry Birds Star Wars that it is creating in partnership with Lucasfilm.
Supercell has also opened a San Francisco office to be closer to the action in the technology industry, most notably the two big companies it works with most often, Apple and Facebook. Greg Harper, the general manager of Supercell’s North America operations, said the company believed the tablet was “the ultimate game platform.”
“The technology and hardware performance really is close to on-par with that of consoles,” Mr. Harper said.
Supercell’s executives are especially excited by the prospect of a new smaller iPad from Apple, now popularly referred to as the iPad mini. Although he was quick to say that Supercell had no inside knowledge of such a device, Mr. Harper said a smaller, less expensive iPad could help the device reach a broader audience.
Mr. Harper says he believes that the growth in the tablet market will be a bad development for dedicated portable game devices from companies like Nintendo and Sony. “That market seems in trouble to me,” he said. “The iPad mini could be one of the final nails in the coffin.”


Thursday, October 11, 2012

Design: Who Made That Escape Key?

Jens Mortensen for The New York Times

“It’s the ‘Hey, you! Listen to me’ key,” says Jack Dennerlein of the Harvard School of Public Health. According to Dennerlein, an expert on how humans interact with computers, the escape key helped drive the computer revolution of the 1970s and ’80s. “It says to the computer: ‘Stop what you’re doing. I need to take control.’ ” In other words, it reminds the machine that it has a human master. If the astronauts in “2001: A Space Odyssey” had an ESC key, Dennerlein points out, they could have stopped the rogue computer Hal in an instant.

The key was born in 1960, when an I.B.M. programmer named Bob Bemer was trying to solve a Tower of Babel problem: computers from different manufacturers communicated in a variety of codes. Bemer invented the ESC key as a way for programmers to switch from one kind of code to another. Later on, when computer codes were standardized (an effort in which Bemer played a leading role), ESC became a kind of “interrupt” button on the PC — a way to poke the computer and say, “Cut it out.”

Why “escape”? Bemer could have used another word — say, “interrupt” — but he opted for “ESC,” a tiny monument to his own angst. Bemer was a worrier. In the 1970s, he began warning about the Y2K bug, explaining to Richard Nixon’s advisers the computer disaster that could occur in the year 2000. Today, with our relatively stable computers, few of us need the panic button. But Bob Frankston, a pioneering programmer, says he still uses the ESC key. “There’s something nice about having a get-me-the-hell-out-of-here key.”

I, KEYBOARD

Joseph Kaye is a senior scientist at Yahoo! Research. Why do outmoded keys, like ESC, persist? Our devices have legacies built into them. For more than a hundred years, when you wanted to write something, you sat down in front of a typewriter. But computers look different now — they’re like smartphones. It will be interesting to see whether in 10 or 15 years the whole idea of a keyboard will seem strange. We might be saying, “Remember when we used to type things?”

How would we control computers in this future-without-typing? Think of the Wii and Kinect, or even specialized input devices for games like Guitar Hero or Dance Dance Revolution. All might be bellwethers for the rest of computing. We might see a rise in all sorts of input, like voice recognition and audio control — think about Siri.
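Bemer's code-switching escape survives almost unchanged: ESC is still ASCII character 27, and an ESC-prefixed sequence still switches a terminal into another mode, as in the ANSI color codes below. A small illustrative sketch:

```python
# ESC is ASCII code 27 (0x1B). Terminals still interpret it as the start
# of a control sequence -- here, the ANSI codes for colored text.
ESC = "\x1b"

red = f"{ESC}[31m"    # switch the terminal into red-text mode
reset = f"{ESC}[0m"   # escape back to normal

print(f"{red}warning{reset}: ESC still switches codes, much as Bemer intended")
```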
This article has been revised to reflect the following correction:
Correction: October 7, 2012
An earlier version misspelled Joseph Kaye’s surname as Kay and misstated his employer. He is a senior scientist at Yahoo! Research, not Nokia Research Center.


Wal-Mart and American Express Join In Prepaid Card Deal

It is a surprising alliance between the discounter Wal-Mart and American Express, which until recently has been focused on high-end consumers. The move is intended to strengthen both companies’ position in the prepaid card market — which, unlike credit and debit cards, is largely unregulated and has far fewer consumer protections.

The account, called Bluebird, will be available next week. The companies are positioning it as an option for people turned off by bank fees. “The only fees consumers will ever pay are clear, transparent and within their control,” such as out-of-network A.T.M. fees, the companies said in a release.

Wal-Mart and American Express declined to give details of the financial relationship between the two companies, but indicated both would profit from the card. The fees disclosed by the companies were generally lower than those Wal-Mart now charges for its prepaid MoneyCard.

Bluebird gives prepaid card holders access to features usually associated with credit cards, like American Express’s customer service, roadside assistance and mobile banking. But consumer advocates say shoppers should be careful in the largely unregulated world of prepaid cards.

The nation’s consumer financial watchdog, the Consumer Financial Protection Bureau, is preparing restrictions on prepaid debit cards. The agency says it has concerns about high fees and inadequate disclosures. Advocacy groups have questioned whether prepaid card issuers clearly explain to cardholders the fees that come with the products, including charges to activate the card, load money on it, check a balance at cash machines and speak to customer service. Consumer advocates have said that the cards, which are typically marketed to lower-income customers, have so many fees that they erode money loaded onto the card.

Prepaid cards work much like debit cards, except that they are not tied to a traditional, regulated bank account. The cards are part of a larger strategy by lenders to tap into the so-called unbanked or underbanked population — customers who use few, if any, bank services. Such people are considered a $45 billion market, according to the Center for Financial Services Innovation, which provides advisory services.

For the Bluebird account, customers can sign up free online or via mobile phone, or pay $5 in a Walmart store. They receive a card stamped with the American Express logo, which they can use anywhere American Express is accepted. They can set up direct deposit for paychecks and deposit other checks by taking a mobile phone picture of them. And they can withdraw cash. The companies do not perform a credit check before creating an account.

American Express and Wal-Mart said there would be no minimum balances to maintain, no monthly or annual fees and no overdraft fees (the account does not allow overdrafts, as it does not issue paper checks). It will cost $2 per out-of-network A.T.M. withdrawal and $2 per withdrawal without direct deposit, but the companies have not yet disclosed other fees. Wal-Mart’s MoneyCard prepaid card costs $3 to buy, $3 a month and $3 to reload.

“We know that the model is financially sustainable for both partners,” said Daniel Eckert, vice president of financial services for Wal-Mart U.S.

David Robertson, publisher of The Nilson Report, an industry publication for payment systems, said companies in deals like this typically shared the amount charged to merchants when a card was used. He said he expected that Wal-Mart had negotiated a lower merchant-fee rate for card use at a Walmart than competitors would receive.

Mr. Robertson said Wal-Mart had most likely realized that its MoneyCard, run by the company Green Dot, was not appealing to all customers. “This market is growing, and it’s moving beyond just that chunk of people that we consider to be underbanked,” he said. “It includes people who might be wanting to buy a prepaid card for other reasons, like budgeting purposes.”

Green Dot’s stock declined 20.2 percent on Monday, though Mr. Eckert said that Wal-Mart would continue to offer its MoneyCard.

Wal-Mart’s financial services plans were once more ambitious: to get a federal bank charter, meaning it could make loans and get deposits insured by the Federal Deposit Insurance Corporation. But there was opposition from the banking industry and from politicians who were worried about small banks. Five years ago, Wal-Mart ceased trying to get a charter and instead started building services that did not require one.

Lenders have been clamoring to grab a bigger piece of the booming prepaid card market. In 2009, consumers held roughly $29 billion on prepaid cards, according to the Mercator Advisory Group, a payments industry research group. By the end of 2013, that is expected to swell to $90 billion. A number of the nation’s largest lenders, including JPMorgan Chase, U.S. Bank, Regions Financial and Wells Fargo, are aggressively rolling out prepaid card offerings.

One incentive for banks to dive in is that prepaid cards are not restricted by the Dodd-Frank financial regulation law. Thanks to that exemption, banks can charge merchants high fees when a consumer swipes a prepaid card.

A recent study by Pew, a nonprofit research group, also indicated that some customers were unaware their prepaid cards were not necessarily protected by the F.D.I.C. Dan Schulman, group president of enterprise growth for American Express, said in a call with reporters that Bluebird was not F.D.I.C.-backed, but that under money-transmittal regulations, American Express was required to hold assets to back up 100 percent of the money in accounts.

Prepaid cards have increasingly come under fire from regulators. Last month, the Office of the Comptroller of the Currency brought an action against Urban Trust Bank in Orlando, Fla., which has branches in Walmart stores. The regulator said it discovered “unsafe and unsound banking practices” related to the community bank’s prepaid card offerings.

Mind: Recalibrating Therapy for a Wired World - The Digital Doctor

But the virtues of the digital age are not always aligned with those of psychotherapy. It takes time to change behavior and alleviate emotional pain, and for many patients constant access is more harmful than helpful. These days, as never before, therapists are struggling to recalibrate their approach to patients living in a wired world.

For some, the new technology is clearly a boon. Let’s say you have the common anxiety disorder social phobia. You avoid speaking up in class or at work, fearful you’ll embarrass yourself, and the prospect of going to a party inspires dread. You will do anything to avoid social interactions.

You see a therapist who sensibly recommends cognitive-behavioral therapy, which will challenge your dysfunctional thoughts about how people see you and as a result lower your social anxiety. You find that this treatment involves a fair amount of homework: You typically have to keep a written log of your thoughts and feelings to examine them. And since you see your therapist weekly, most of the work is done solo.

As it turns out, there is a smartphone app that will prompt you at various times during the day to record these social interactions and your emotional response to them. You can take the record to your therapist, and you are off and running.

Struggling with major depression? Digital technology may soon have something for you, too. Depressed patients are characteristically lacking in motivation and pleasure; an app easily could lead patients through the day with chores and activities, like having a therapist in one’s pocket. Not just that, but the app might ask you to rate depressive symptoms like sleep, energy, appetite, sex drive and concentration in real time, so that when you next visit your psychiatrist, you can present a more accurate picture of your clinical status without having to worry about your recall.
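The real-time symptom diary imagined above could be sketched as a small program. This is purely illustrative; the class and field names are invented for the example and are not taken from any actual app.

```python
from datetime import datetime

class SymptomLog:
    """Hypothetical real-time mood diary of the kind described above."""

    def __init__(self):
        self.entries = []  # (timestamp, symptom, rating on a 1-10 scale)

    def record(self, symptom, rating, when=None):
        if not 1 <= rating <= 10:
            raise ValueError("rating must be between 1 and 10")
        self.entries.append((when or datetime.now(), symptom, rating))

    def summary(self):
        # Average rating per symptom: the "more accurate picture" a
        # patient could bring to the next appointment.
        by_symptom = {}
        for _, symptom, rating in self.entries:
            by_symptom.setdefault(symptom, []).append(rating)
        return {s: round(sum(r) / len(r), 1) for s, r in by_symptom.items()}

log = SymptomLog()
log.record("sleep", 4)
log.record("sleep", 6)
log.record("energy", 3)
print(log.summary())  # {'sleep': 5.0, 'energy': 3.0}
```

The point of the averaging step is exactly the one in the article: ratings captured in the moment, then condensed for the clinician, sidestep the patient's unreliable recall.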

When it comes to collecting and organizing data, software is hard to beat. But information has a tendency to spread, especially digital information. To wit, electronic medical data containing sensitive personal information can be released, either accidentally or deliberately, and disseminated. Anyone who has followed the hacking of supposedly secure and encrypted financial databases knows this is not a remote possibility.

More worrisome to therapists, perhaps, is that technology also enables access: These days patients reach out via text, e-mail, Facebook, Twitter. For some of them, the easy connectivity that technology makes possible is a decidedly bad idea.

Take a patient who has a fundamental problem in maintaining intimate relationships and who can’t tolerate being alone without feeling bored or anxious — in other words, a patient with typical features of borderline personality disorder. Not surprisingly, such a patient would love instant access to a therapist whenever an uncomfortable feeling arises.

In this case, connectivity would interfere with the central goal of any reasonable treatment, namely acquiring the skills to manage painful feelings by oneself and the ability to tolerate some degree of disappointment. Access-on-demand would undermine efforts to develop patience and frustration tolerance, and might encourage a sense of entitlement and an illusory notion of power and control.

But perhaps the more difficult challenge is this: By removing barriers to access, digital technology can make therapists more real and knowable to their patients. This cuts both ways.

Recently, a patient I had treated for depression was struggling with the approaching death of his beloved dog. Just divorced, he was dreading another loss. One night while surfing the Internet, he came across a piece I wrote years ago about the death of my own dog.

“So you understand what it’s like,” he said during one of our sessions. This discovery made him feel understood and comforted.

Sometimes, though, digital technology can undermine the clinical rationale for a therapist to maintain distance.

For example, in insight-oriented psychotherapy, which focuses on unconscious processes at the root of personal conflicts, the patient essentially uses his relationship with the therapist to understand how he structures relationships with people in general. The therapist must be free to “become” many different important people in the patient’s life; the more the patient knows of the therapist’s real life, the likelier it is that the treatment will be confounded.

Imagine how you might feel if you had a philandering parent and were having trouble in your own relationships, and you discovered that your own therapist was married and having an affair. It would be hard to believe this would not affect your relationship with your therapist.

Many patients don’t want to know how their therapists feel or the details of their personal lives, and for a good reason: It can undermine the perceived authority of the therapist, making patients feel less secure. And it can inhibit patients from being open for fear of hurting or upsetting their therapists.

I wonder if it’s even possible for therapists to remain anonymous in the age of the Internet, where we can all be found in the electronic cloud. A Google search might not reveal a therapist’s deep, dark secrets, but even basic information begins to alter the relationship.

Last summer, a patient learned that I was swimming in a benefit race in Cape Cod because I’d written something about it that was available online.

“Be careful, Dr. Friedman,” he said with a smile on the way out of my office. “I heard there were sharks out there.” Beneath the humor was anxiety — or perhaps something darker.

Digital technology has the potential to either enhance or confound therapy, but much depends on the patient and the condition being treated. Some patients will find that the glowing screen only feeds their psychopathology. Others will find digital technology a boon to self-esteem and assertiveness. We are only beginning to figure out which patients are which.

Dr. Richard A. Friedman is a professor of psychiatry at Weill Cornell Medical College in Manhattan.



Wednesday, October 10, 2012

U.S. Panel Calls Huawei and ZTE ‘National Security Threat’

The House Intelligence Committee said that after a yearlong investigation it had come to the conclusion that the Chinese businesses, Huawei Technologies and ZTE Inc., were a national security threat because of their attempts to extract sensitive information from American companies and their loyalties to the Chinese government.

The companies sell telecommunications equipment needed to create and operate wireless networks, like the ones used by Verizon Wireless and AT&T. Many of the major suppliers of the equipment are based outside the United States, creating concerns here about the security of communications. Those concerns are most acute about Huawei and ZTE because of their close ties to the Chinese government, which the committee said has heavily subsidized the companies.

Allowing the Chinese companies to do business in the United States, the report said, would give the Chinese government the ability to easily intercept communications and could allow it to start online attacks on critical infrastructure, like dams and power grids.

The release of the report comes as both presidential candidates have spoken of the importance of United States ties with China and have promised to act strongly on Chinese currency and trade practices that are damaging to American business interests. Mitt Romney, the Republican presidential candidate, has called repeatedly during his campaign for a more confrontational approach to China on business issues, although he has focused his warnings more on Chinese currency market interventions than on the activities of the nation’s telecommunications companies.

President Obama has also taken a tougher stance on China recently. Late last month, Mr. Obama, through the Committee on Foreign Investment, ordered a Chinese company to divest itself of interests in four wind farm projects near a Navy base in Oregon where drone aircraft training takes place. It was the first time a president had blocked such a deal in 22 years. The Obama administration has also filed a case at the World Trade Organization in Geneva accusing China of unfairly subsidizing its exports of autos and auto parts, the ninth trade action the administration has brought against China.

“We have a process that is not aimed at one specific company but using all the assets and parts of U.S. government aimed at protecting our telecommunications and critical infrastructure,” a senior White House official said.

The report was released on Monday morning at a news conference held by Representative Mike Rogers, Republican of Michigan, the chairman of the House Intelligence Committee, and Representative C. A. Ruppersberger of Maryland, the top Democrat on the committee. They said that the United States government should be barred from doing business with Huawei and ZTE and that American companies should avoid buying their equipment.

The report said the committee had obtained internal documents from former employees of Huawei that showed it supplied services to a “cyberwarfare” unit in the People’s Liberation Army. The United States government, the report said, should go through the Committee on Foreign Investment in the United States, an interagency panel that reviews the national security implications of foreign investments, to carry out its recommendations. It also said that committee should block any mergers and acquisitions involving the Chinese companies and American businesses.

In the course of the investigation, the House committee said it had uncovered evidence of economic espionage — and officials said on Monday that they planned to hand over the evidence to the F.B.I. Former and current employees for Huawei, the report said, told investigators for the committee that the company had committed “potential violations” in the United States related to immigration, bribery, corruption and copyright infringement.

Huawei has been the focus of criticism and security warnings for years, including by the Defense Department. Its expansion plans in the United States have faced resistance from Congress over questions about its ties to the military in China. Huawei denies being financed to undertake research and development for the Chinese military, and its executives have repeatedly insisted that they have nothing to hide. The company issued an open letter to the United States government in February 2011, asking for an inquiry to clear up what it characterized as misperceptions about its history and business operations.
Michael S. Schmidt reported from Washington and Christine Hauser from New York. Keith Bradsher contributed reporting from Hong Kong, and Quentin Hardy from San Francisco.


3-D medical scanner: New handheld imaging device to aid doctors on the 'diagnostic front lines'

In the operating room, surgeons can see inside the human body in real time using advanced imaging techniques, but primary care physicians, the people who are on the front lines of diagnosing illnesses, haven't commonly had access to the same technology -- until now. Engineers from the University of Illinois at Urbana-Champaign (UIUC) have created a new imaging tool for primary care physicians: a handheld scanner that would enable them to image all the sites they commonly examine, and more, such as bacterial colonies in the middle ear in 3-D, or monitor the thickness and health of patients' retinas. The device relies on optical coherence tomography (OCT), a visualization technology that is similar to ultrasound imaging, but uses light instead of sound to produce the images.

The team will present their findings at the Optical Society's (OSA) Annual Meeting, Frontiers in Optics (FiO) 2012, taking place Oct. 14-18 in Rochester, N.Y.

To monitor chronic conditions such as ear infections, primary care physicians currently rely on instruments that are essentially magnifying glasses, says UIUC physician and biomedical engineer Stephen Boppart, who will present the team's findings at FiO. The new handheld imaging device would give doctors a way to quantitatively monitor these conditions, and possibly make more efficient and accurate referrals to specialists.

The scanners include three basic components: a near-infrared light source and OCT system, a video camera to relay real-time images of surface features and scan locations, and a microelectromechanical systems (MEMS)-based scanner to direct the light. Near-infrared wavelengths of light penetrate deeper into human tissues than other wavelengths, which are more readily absorbed by the body. By measuring the time it takes the light to bounce back from tissue microstructure, computer algorithms build a picture of the structure of tissue under examination.
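The relationship between echo delay and depth can be sketched in a few lines. Note the hedge: clinical OCT recovers these delays interferometrically rather than timing them directly, and the tissue refractive index here is an assumed round number; the sketch only shows the geometry of turning a round-trip delay into a depth.

```python
C = 299_792_458.0    # speed of light in vacuum, m/s
TISSUE_INDEX = 1.38  # assumed refractive index of soft tissue

def depth_from_delay(round_trip_seconds, n=TISSUE_INDEX):
    """Depth of a reflector given the round-trip echo delay.

    The light travels down and back, hence the factor of 2, and it
    moves slower inside tissue by the refractive index n.
    """
    return (C / n) * round_trip_seconds / 2.0

# A reflector 1 mm deep returns its echo after roughly 9.2 picoseconds:
delay = 2 * 0.001 * TISSUE_INDEX / C
print(depth_from_delay(delay))  # ~0.001 (meters)
```

The picosecond scale of the delays is why OCT needs interferometry: no detector clocks such intervals directly.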

Diabetic patients in particular may benefit from the device. About 40 to 45 percent of diabetics develop leaky blood vessels in their retinas -- a condition called retinopathy, which can lead to thickening of the retina, blurry vision, and eventually blindness. The handheld OCT device would allow doctors to monitor the health of the retina, potentially catching retinopathy in its early stages. In some cases, changes in the eye could help doctors diagnose diabetes, Boppart says.

Boppart and his team are hopeful that falling production costs combined with smaller, more compact designs will put the scanners in the hands of more physicians and make them a common point-of-care tool. Eventually, they would like to see the imagers at work in developing countries as well. He and an international team of collaborators recently received a $5 million National Institutes of Health Bioengineering Research Partnership grant to further refine the device.


Dick Costolo of Twitter, an Improv Master Writing Its Script

The audience, le beau monde of cinema, has gathered at the Debussy Theater on this unseasonably cool May morning on the French Riviera. The event, officially the opening of the 2012 Cannes Film Festival, will be remembered for freakish storms that left stars shivering on the soaked red carpet.

But before the Palme d’Or, a little stand-up comedy from Mr. Costolo, the chief executive of Twitter. He has prepared some sober remarks for the occasion — a paean to the mighty tweet, an explication of how new tools of social media are reinventing business, social activism and everything in between.

Nah. Out goes the script. “Since I’ve got 45 minutes, if we can just start with some quick introductions,” he says, gesturing to the front row. “Start over here. Stand up, say what company you’re from and what animal you could be if you could be any animal.”

So goes his keynote speech at Cannes. It’s not quite as strange as it sounds. Long before the Twitter revolution and his ascent to the heights of social media, Mr. Costolo was a professional comedian. And you know what? He’s still doing improv — only it’s the business kind.

He’ll wax on about growth and revenue like the next C.E.O. But then he’ll dig out a joke and do something that might hurt his business — and miff his investors — because, well, he thinks that something is the right thing to do. He has broken with the pack on the issue of patent infringement, an issue that drives the tech world crazy, and, in stark contrast to Facebook, has let newcomers to the site opt out of being tracked through the service — a daring move, given that Twitter makes money from advertising.

Even in Silicon Valley, that Neverland of Mark Zuckerberg and hoodied Lost Boy executives, Mr. Costolo can seem an un-C.E.O. To which he says, essentially, whatever. “People have Plato’s form in their mind of what a leader is, or what a C.E.O. is, and it is a bunch of elements that I really don’t conform to at all,” Mr. Costolo says. “I’ve given this a lot of thought, and I came to the conclusion that I don’t care.”

That kind of attitude could take Twitter to heretofore unimaginable success. Or it could turn it into a B-school case study of a start-up company gone wrong. The choice, for the moment, is Mr. Costolo’s.

Today, Twitter seems ubiquitous. But this company didn’t even exist seven years ago. Bankrolled by venture capitalists, it has grown into a multibillion-dollar enterprise with 140 million users worldwide. Although the company doesn’t share its financials, it is estimated that it will have $350 million in revenue this year. “We’re an entire quarter ahead of our projected goals,” one executive says. Its next big step is to go public on the stock market, and insiders say the current goal is to have an initial public offering in 2014.

Twitter’s social media twin, Facebook, has already gone public, of course — and, so far, Facebook stockholders have lost billions, at least on paper. Facebook’s troubled I.P.O. hangs over the technology industry as a cautionary tale of how investors can become star-struck.

Mr. Costolo didn’t found Twitter. Jack Dorsey, Christopher Stone and Evan Williams did. But today Mr. Costolo is essentially running the business alone, and friends and colleagues say he is eager to build the company. And he has succeeded before.

During the early 1990s, he worked at Andersen Consulting to subsidize his comedy career. He tried to explain this thing called the World Wide Web to his bosses, but, he says, they didn’t listen. So he and several co-workers started their own consulting firm, Burning Door Networked Media, specializing in Web projects. Mr. Costolo went on to help found and sell three companies. One of them, Spyonit, notified people when a Web site changed. (This was a decade before anyone had heard the term “real time.”) People used Spyonit to monitor auctions on eBay and to see when comment threads were updated on Web forums. Another of his companies, FeedBurner, helped bloggers syndicate content. FeedBurner was sold to Google in 2007 for more than $100 million.

But starting a small company and then selling it to a big one, difficult as it can be, seems easy next to taking Twitter to the next level. On paper, Twitter is valued at close to $10 billion. That means the most likely exit strategy for its initial backers — notably Charles River Ventures, Benchmark Capital, Union Square Ventures and Mr. Costolo himself — would be to take the company public. But after the Facebook fiasco, Mr. Costolo will have to persuade Wall Street that Twitter, and its share price, could keep rising. His audience — Wall Street, Silicon Valley and the wider world — is waiting for his next act.

Tuesday, October 9, 2012

Acoustic cell-sorting chip may lead to cell phone-sized medical labs

A technique that uses acoustic waves to sort cells on a chip may create miniature medical analytic devices that could make Star Trek's tricorder seem a bit bulky in comparison, according to a team of researchers.

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a dime-sized chip, said Tony Jun Huang, associate professor of engineering science and mechanics, Penn State. By changing the frequency of the acoustic waves, researchers can easily alter the paths of the cells.

Huang said that since the device can sort cells into five or more channels, it will allow more cell types to be analyzed simultaneously, which paves the way for smaller, more efficient and less expensive analytic devices.

"Eventually, you could do analysis on a device about the size of a cell phone," said Huang. "It's very doable and we're making inroads to that right now."

Biological, genetic and medical labs could use the device for various types of analysis, including blood and genetic testing, Huang said.

Most current cell-sorting devices allow the cells to be sorted into only two channels in one step, according to Huang. He said that another drawback of current cell-sorting devices is that cells must be encapsulated into droplets, which complicates further analysis.

"Today, cell sorting is done on bulky and very expensive devices," said Huang. "We want to minimize them so they are portable, inexpensive and can be powered by batteries."

Using sound waves for cell sorting is less likely to damage cells than current techniques, Huang added.

In addition to the inefficiency and the lack of controllability, current methods produce aerosols, airborne droplets that require extra safety precautions to handle.

The researchers, who released their findings in the current edition of Lab on a Chip, created the acoustic wave cell-sorting chip using a layer of silicone -- polydimethylsiloxane. According to Huang, two parallel transducers, which convert alternating current into acoustic waves, were placed at the sides of the chip. As the acoustic waves interfere with each other, they form pressure nodes on the chip. As cells cross the chip, they are channeled toward these pressure nodes.

The transducers are tunable, which allows researchers to adjust the frequencies and create pressure nodes on the chip.
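The geometry behind that tunability can be sketched in a simplified one-dimensional model: two counter-propagating waves form pressure nodes every half wavelength, so raising the frequency packs the nodes closer together and moves the channels a cell can be steered into. The sound speed and channel width below are illustrative assumptions, not the paper's actual device parameters.

```python
def pressure_node_positions(frequency_hz, sound_speed, length_m):
    """Node positions for a 1-D standing wave between two opposed
    transducers (a simplified picture; values are illustrative only).

    Counter-propagating waves interfere to form pressure nodes every
    half wavelength, so raising the frequency packs the nodes closer
    together -- which is how a tunable device can redirect cells.
    """
    half_wavelength = sound_speed / (2.0 * frequency_hz)
    positions = []
    x = half_wavelength / 2.0  # offset of the first node (assumed)
    while x < length_m:
        positions.append(x)
        x += half_wavelength
    return positions

# Assume a water-like fluid (~1480 m/s) in a 1 mm channel at 10 MHz:
nodes = pressure_node_positions(10e6, 1480.0, 0.001)
print(len(nodes))           # 14
print(nodes[1] - nodes[0])  # 7.4e-05 m between adjacent nodes
```

Doubling the frequency halves the node spacing, which is the knob the researchers turn to change where cells end up.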

The researchers first tested the device by sorting a stream of fluorescent polystyrene beads into three channels. Prior to turning on the transducer, the particles flowed across the chip unimpeded. Once the transducer produced the acoustic waves, the particles were separated into the channels.

Following this experiment, the researchers sorted human white blood cells that were affected by leukemia. The leukemia cells were first focused into the main channel and then separated into five channels.

The device is not limited to five channels, according to Huang.

"We can do more," Huang said. "We could do 10 channels if we want, we just used five because we thought it was impressive enough to show that the concept worked."

Huang worked with Xiaoyun Ding, graduate student, Sz-Chin Steven Lin, postdoctoral research scholar, Michael Ian Lapsley, graduate student, Xiang Guo, undergraduate student, Chung Yu Keith Chan, doctoral student, Sixing Li, doctoral student, all of the Department of Engineering Science and Mechanics at Penn State; Lin Wang, Ascent BioNano Technologies; and J. Philip McCoy, National Heart, Lung and Blood Institute, National Institutes of Health.

The National Institutes of Health Director's New Innovator Award, the National Science Foundation Graduate Research Fellowship and the Penn State Center for Nanoscale Science supported this work.


Campaigns Use Social Media to Lure In Younger Voters

If the presidential campaigns of 2008 were dipping a toe into social media like Facebook and Twitter, their 2012 versions are well into the deep end. They are taking to fields of online battle that might seem obscure to the non-Internet-obsessed — sharing song playlists on Spotify, adding frosted pumpkin bread recipes to Pinterest and posting the candidates’ moments at home with the children on Instagram.

At stake, the campaigns say they believe, are votes from citizens, particularly younger ones, who may not watch television or read the paper but spend plenty of time on the social Web. The campaigns want to inject themselves into the conversation on services like Tumblr, where political dialogue often takes the form of remixed photos and quirky videos. To remind Tumblr users about the first presidential debate on Wednesday, Mr. Obama’s team used an obscure clip of Lindsay Lohan saying “It’s October 3” in the comedy “Mean Girls.” And on Twitter, Mitt Romney’s bodyguard posted a picture of the candidate’s family playing Jenga before the debate.

The techniques may be relatively new, but they are based on some old-fashioned political principles, according to Zachary Moffatt, the digital director for the Romney campaign. “The more people you talk to, the more likely you are to win,” said Mr. Moffatt, who oversees about 120 staff members and volunteers. “The more people who interact with Mitt, the more likely he is to win. Social extends and amplifies that.”

But as is the way of the Web, a well-intended post or picture on social networks can quickly morph into a disaster. And the slightest gaffe on the campaign trail can become a “Groundhog Day” moment, repeated endlessly. “Even a typo is a big deal,” Mr. Moffatt said.

In July, when Mr. Obama told a crowd of supporters “You didn’t build that” while talking about the importance of public infrastructure, the Romney campaign pounced, uploading photos of hot-dog-joint owners and others displaying signs with variations on the slogan “I built this.” And Clint Eastwood’s mock interview with the president at the Republican convention sent the Web into a frenzy. Within minutes, images of Mr. Eastwood on stage, plastered with cutting captions, hit Tumblr, and Twitter was flooded with parodies. Mr. Obama’s team joined in, sharing on Twitter a photo of him in a chair marked “The President,” with the caption, “This seat’s taken.”

That retort is in line with the overall social media presence of the Obama campaign, which tends to be sharper and more attitude-laden than the Republican efforts, particularly on Tumblr. The morning after the debate, the Obama Tumblr followed up on Mr. Romney’s reference to cutting financing for PBS by posting something that was circulating on Twitter: a picture of Big Bird from Sesame Street with the caption “Mitt Romney’s Plan to Cut the Deficit: Fire This Guy.” (Laura Olin, who previously worked at a digital strategy agency, helps lend a savvy tone to the campaign’s Tumblr efforts.)

Both camps tend to rely heavily on photos, slogans and the like that have been generated by their supporters. The Obama team, in particular, is fond of posting GIFs, or short looping video clips, that have been made by others. These might show the president high-fiving children or hugging his wife and daughters. Other clips poke fun at rivals or give knowing nods to hip television shows like “Parks and Recreation.”

At times the campaign’s freer-wheeling tone can get it into trouble: an image it shared on Tumblr that urged followers to “vote like your lady parts depend on it” drew criticism from conservative bloggers and others who thought it was in poor taste. The campaign quickly took down the image, saying it had not been properly vetted. Those who keep up with the Obama campaign on Tumblr seem to approve of the approach — with some posts attracting close to 70,000 “notes,” or likes and reposts from users. “It’s about authentic, two-way communication,” said Adam Fetcher, deputy press secretary for the Obama campaign. “Social media is a natural extension of our massive grass-roots organization.”

By comparison, the Romney campaign’s presence on Tumblr is more subdued, sticking largely to posterlike photos with slogans like “No, we can’t.” Its posts rarely get more than 400 responses.

Both campaigns have teams of Internet-adept staff members who try to coordinate their strategy and message across many social sites. They declined to specify how this works, saying they did not want to tip off the competition. But both rely heavily on Facebook and Twitter to solicit donations, blast out reminders of events and share articles and videos conveying their stances. Flickr and Instagram serve as scrapbooks from the campaign trail, showing the candidates trying the pie at small-town restaurants. On Tumblr and Pinterest, the campaigns often highlight photos and other material from supporters.

As important as the campaigns say these efforts are, the candidates themselves are not actually doing the posting. But sometimes their wives are. While Mr. Romney has a campaign-run Pinterest board, his wife, Ann, has her own, showcasing her favorite crafts projects and books. When Michelle Obama posts a message on Twitter or shares an image on the campaign’s Pinterest board, her posts bear her initials — “mo” — so they stand out among those generated by campaign staff.

Twitter and Facebook are still the biggest avenues for online canvassing, with their broad demographic reach and user numbers that have grown tenfold from four years ago. It may be hard to fathom what posting video clips or music playlists on less mainstream sites has to do with the election. Does it really matter to voters if Mr. Obama has Stevie Wonder on his list, while Mr. Romney prefers Johnny Cash? Though the returns on such efforts are not easily quantifiable, neither party is taking any chances.

“What’s the return on putting your pants on in the morning? We don’t know,” said Jan Rezab, the chief executive of Socialbakers, a social media analytics firm. “But we just know it’s bad if you don’t do it.”

Coye Cheshire, an associate professor at the School of Information at the University of California at Berkeley, pointed to another motivation for such seemingly trivial online updates. “It is important for people to know whether or not a huge political figure shares the same taste as me,” said Dr. Cheshire, who studies behavior and trust online. “And creating a playlist on Spotify is part of what makes them seem more human.”

Monday, October 8, 2012

Art.sy Is Mapping the World of Art on the Web

But until now, there was no automated guidance for art lovers seeking discoveries online — no “If you like Jackson Pollock’s ‘No. 1,’ you may also enjoy Mark Rothko’s ‘No. 18.’ ”

Enter Art.sy, a start-up whose public version went live on Monday. An extensive free repository of fine-art images and an online art appreciation guide, it is predicated on the idea that audiences comfortable with image-driven Web sites like Tumblr and Pinterest are now primed to spend hours browsing through canvases and sculpture on their monitors and tablets, especially with one-click help.

After two years of private testing and with millions of dollars from investors, including some celebrities in the art and technology worlds, the site aims to do for visual art what Pandora did for music and Netflix for film: become a source of discovery, pleasure and education.

With 275 galleries and 50 museums and institutions as partners, Art.sy has already digitized 20,000 images into its reference system, which it calls the Art Genome Project. But as it extends the platform’s reach, Art.sy also raises questions about how (or if) digital analytics should be applied to visual art. Can algorithms help explain art?

Robert Storr, dean of the Yale University School of Art, has his doubts. “It depends so much on the information, who’s doing the selection, what the criteria are, and what the cultural assumptions behind those criteria are,” Mr. Storr, a former curator of painting and sculpture at the Museum of Modern Art, said. In terms of art comprehension, he added, “I’m sure it will be reductive.”

The technology, at least, is expansive. To make suggestions successfully, computers must be taught expert human judgment, a process that starts with labeling: give a machine codes to tell the difference between a Renaissance portrait and a Modernist drip painting, say, and then it can sort through endless works, making comparisons and drawing connections.

For the Art Genome Project, Matthew Israel, 34, who holds a Ph.D. in art and archaeology from the Institute of Fine Arts at New York University, leads a team of a dozen art historians who decide what those codes are and how they should be applied. Some labels (Art.sy calls them “genes” and recognizes about 800 of them, with more added daily) denote fairly objective qualities, like the historical period and region the work comes from and whether it is figurative or abstract, or belongs in an established category like Cubism, Flemish portraiture or photography.

Other labels are highly subjective, even quirky; for contemporary art, for example, Art.sy’s curators might attach terms like “globalization” and “culture critique” to give ideological context. “Contemporary traces of memory” is an elastic theme assigned to pieces by the Chinese Conceptual artist Cai Guo-Qiang and the photographer and filmmaker Matt Saunders.

A Picasso might be tagged with “Cubism,” “abstract painting,” “Spain,” “France” and “love,” all terms that are visible and searchable on the site. Jackson Pollock’s works typically get “abstract art,” “New York School,” “splattered/dripped,” “repetition” and “process-oriented.” Predictably, some of those criteria show up on paintings by Pollock’s contemporaries Robert Motherwell and Willem de Kooning, but also on artists from different eras and styles, like Tara Donovan, whose contemporary abstract sculptures using stacked and layered plastic foam and paper plates have also been marked with “repetition.”

As the categories are applied, each is assigned a value between 1 and 100: an Andy Warhol might rate high on the Pop Art scale, while a post-Warholian could rank differently, depending on influences. Software can help filter images for basic visual qualities like color, but the soul of the judgment is human.

“Literally, a person goes in by hand, and they enter a number for all the relevant fields,” Mr. Israel said.
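
Art.sy has not published the mechanics of its matching, so the following is an illustration only: a minimal Python sketch of how hand-entered gene values could drive recommendations. The gene names and scores are hypothetical, and cosine similarity stands in for whatever measure the site actually uses.

```python
from math import sqrt

# Hypothetical "genome" vectors: gene name -> strength (1-100),
# in the spirit of the Art Genome Project's hand-entered values.
warhol = {"pop-art": 95, "repetition": 70, "silkscreen": 90}
pollock = {"abstract-art": 90, "splattered-dripped": 95, "repetition": 80}
donovan = {"repetition": 85, "abstract-art": 75, "sculpture": 90}

def cosine(a, b):
    """Cosine similarity over shared genes; 0 = no overlap, 1 = identical profile."""
    shared = set(a) & set(b)
    dot = sum(a[g] * b[g] for g in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(query, catalog):
    """Rank the catalog's works by genome similarity to the query work."""
    return sorted(catalog, key=lambda name: cosine(query, catalog[name]), reverse=True)

catalog = {"Pollock": pollock, "Donovan": donovan}
print(recommend(warhol, catalog))
```

Note that the shared "repetition" gene alone is enough to link a Warhol to a Donovan here, which is the kind of cross-era connection the project is after.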

The technical complexity is outweighed by the curatorial challenges. “We learned that the data matters much more than the math,” said Daniel Doubrovkine, 35, who is in charge of engineering at Art.sy. “How are you going to pick something that shows ‘warmth’ with a machine? We’re not.”

Similarly, Pandora has a roomful of musicologists deconstructing each tune; their analysis is then fed into an algorithm, called the Music Genome Project, that recommends songs in its player based on users’ taste and the ratings they give each track. (Joe Kennedy, the chief executive of Pandora, served as a consultant to Art.sy.)

But Art.sy aims to make connections among artworks that are seemingly from different worlds, with a catalog that encompasses pieces from the British Museum, the National Gallery in Washington, the Los Angeles Museum of Contemporary Art and others. A recent partner, the Cooper-Hewitt, National Design Museum in Manhattan, a branch of the Smithsonian, has added objects to the mix, which will be a test of the site’s technology and the parallels it draws, said Seb Chan, the Cooper-Hewitt’s director of digital and emerging media.

Culturally, “what does it mean to recommend a painting from seeing a seventh-century spoon, for example?” he said. Anticipating such questions, the Art.sy staff has a blog explaining how its process works.



Sea urchin's spiny strength revealed

For the first time, a team of Australian engineers has modelled the microscopic mechanics of a sea urchin's spine, gaining insight into how these unusual creatures withstand impacts in their aquatic environment.

The skeleton of the purple-spined sea urchin (Centrostephanus rodgersii), found in tidal waters along the coast of New South Wales, has many long spines extending from its core. These spiky features are used for walking, for sensing the environment, and for protection against predators and rough surf.

The long, hollow spines are made from a single crystal of calcite -- a brittle, glass-like mineral -- arranged in a porous, intricate structure. Materials scientists have studied the chemical composition of these spines, but until now there had been no exploration of how they respond to mechanical stress.

In this latest study, published in the open-access journal PLOS ONE, researchers from UNSW and the Australian National University explain how this unique and intricate structure enables an advantageous blend of elasticity and brittleness, which allows the spine to better absorb impacts and stress under some conditions, and snap off under others.

In addition to offering some new insight into this curious marine animal, researchers say the finding could offer clues for creating new bio-inspired materials and more efficient engineering designs, which often strive to improve strength-to-weight efficiency.

The sea urchin's spine strength is particularly interesting given the brittleness of its constituent materials, says lead author Dr Naomi Tsafnat from the School of Mechanical and Manufacturing Engineering at UNSW.

"While we're not certain that this evolutionary feature is optimised, it certainly works -- the longevity of this creature, having survived hundreds of millions of years, is a testament to that," she says. "The spine is both strong and lightweight, and has mechanical characteristics that suit the sea urchin's needs."

"It can withstand some types of loads, like compression, which allows the sea urchin to manoeuver and walk around, but snaps easily when the urchin needs to protect itself from predators."

Using a process known as microtomography, the researchers created a high-resolution 3D microscopic image of a segment of spine, which allowed them to identify unique features in its architecture: protruding wedges and barbs on the surface, linked together by tiny bridging structures that spiral around the spine's axis.

The team used this 3D image to create a computer model of the spine segment, and then simulated various mechanical load scenarios. They observed that different types of stress concentrate at different points within the architecture. This contributes greatly to its strength and unusual elasticity under certain strains.


Engineers invent new device that could increase Internet download speeds

A team of scientists and engineers at the University of Minnesota has invented a unique microscale optical device that could greatly increase the speed of downloading information online and reduce the cost of Internet transmission.

The device uses the force generated by light to flip a mechanical switch of light on and off at very high speed. This development could lead to advances in computation and signal processing that use light instead of electrical current, with higher performance and lower power consumption.

The research results were published October 2 in the online journal Nature Communications.

"This device is similar to electromechanical relays but operates completely with light," said Mo Li, an assistant professor of electrical and computer engineering in the University of Minnesota's College of Science and Engineering.

The new study is based on a previous discovery by Li and collaborators in 2008 where they found that nanoscale light conduits can be used to generate a strong enough optical force with light to mechanically move the optical waveguide (channel of information that carries light). In the new device, the researchers found that this force of light is so strong that the mechanical property of the device can be dominated completely by the optical effect rather than its own mechanical structure. The effect is amplified to control additional colored light signals at a much higher power level.

"This is the first time that this novel optomechanical effect is used to amplify optical signals without converting them into electrical ones," Li said.

Glass optical fibers carry many communication channels, using different colors of light assigned to different channels. In optical cables, these different-colored light channels do not interfere with each other, which is what allows a single optical fiber to transmit more information over very long distances. But this advantage also harbors a disadvantage: when it comes to computation and signal processing, optical devices could not easily allow the various channels of information to control each other -- until now.

The researchers' new device has two optical waveguides, each carrying an optical signal. Placed between the waveguides is an optical resonator in the shape of a microscale donut (like a mini-Hadron collider). In the optical resonator, light can circulate hundreds of times, gaining intensity.

Using this resonance effect, the optical signal in the first waveguide is significantly enhanced in the resonator and generates a very strong optical force on the second waveguide. The second waveguide is released from the supporting material so that it moves in oscillation, like a tuning fork, when the force is applied on it. This mechanical motion of the waveguide alters the transmission of the optical signal. Because the power of the second optical signal can be many times higher than the control signal, the device functions like a mechanical relay to amplify the input signal.

Currently, the new optical relay device operates one million times per second. Researchers expect to improve it to several billion times per second. The mechanical motion of the current device is sufficiently fast to connect radio-frequency devices directly with fiber optics for broadband communication.

Li's team at the University of Minnesota includes graduate students Huan Li, Yu Chen and Semere Tadesse and former postdoctoral fellow Jong Noh. Funding for the project came from the University of Minnesota College of Science and Engineering and the Air Force Office of Scientific Research.


Graphene nanopores can be controlled: Less costly ways of sequencing DNA

Engineers at The University of Texas at Dallas have used advanced techniques to make the material graphene small enough to read DNA.

Shrinking the size of a graphene pore to less than one nanometer -- small enough to thread a DNA strand -- opens the possibility of using graphene as a low-cost tool to sequence DNA.

"Sequencing DNA at a very cheap cost would enable scientists and doctors to better predict and diagnose disease, and also tailor a drug to an individual's genetic code," said Dr. Moon Kim, professor of materials science and engineering. He was senior author of an article depicted on the cover of the September print edition of Carbon.

The first reading, or sequencing, of human DNA by the international scientific research group known as the Human Genome Project cost about $2.7 billion. Engineers have been researching alternative nanomaterials that can thread DNA strands, aiming to reduce the cost to less than $1,000 per person.


It was demonstrated in 2004 that graphite could be changed into a sheet of bonded carbon atoms called graphene, which is believed to be the strongest material ever measured. Because graphene is thin and strong, researchers have searched for ways to control its pore size. They have not had much success. A nanoscale sensor made of graphene could be integrated with existing silicon-based electronics that are very advanced and yet cheap, to reduce costs.

In this study, Kim and his team manipulated the size of the nanopore using an electron beam from an advanced electron microscope combined with in-situ heating at temperatures up to 1,200 degrees Celsius.

"This is the first time that the size of the graphene nanopore has been controlled, especially shrinking it," said Kim. "We used high-temperature heating and the electron beam simultaneously; one technique without the other doesn't work."

Now that researchers know the pore size can be controlled, the next step in their research will be to build a prototype device.

"If we could sequence DNA cheaply, the possibilities for disease prevention, diagnosis and treatment would be limitless," Kim said. "Controlling graphene puts us one step closer to making this happen."

Other UT Dallas researchers from the Erik Jonsson School of Engineering and Computer Science involved in this project are Dr. Ning Lu, research scientist in materials science and engineering; Dr. Jinguo Wang, associate EM Facility Director; and Dr. Herman Carlo Floresca, postdoctoral research fellow in materials science and engineering.

The study was funded by the Southwest Academy of Nanoelectronics, Air Force Office of Scientific Research and the World Class University Program.


Case of missing quasar gas clouds now solved

The case of the missing quasar gas clouds has been solved by a worldwide research team led by Penn State astronomers Nurten Filiz Ak and Niel Brandt. The discovery was announced Oct. 1 in a paper published in The Astrophysical Journal, which describes 19 distant quasars whose giant clouds of gas seem to have disappeared in just a few years.


"We know that many quasars have structures of fast-moving gas caught up in 'quasar winds,' and now we know that those structures can regularly disappear from view," said Filiz Ak, a graduate student in the Department of Astronomy and Astrophysics at Penn State and lead author of the paper. "But why is this happening?"


Quasars are powered by gas falling into supermassive black holes at the centers of galaxies. As the gas falls into the black hole, it heats up and gives off light. The gravitational force from the black hole is so strong, and is pulling so much gas, that the hot gas glows brighter than the entire surrounding galaxy. But with so much going on in such a small space, some of the gas is not able to find its way into the black hole. Much of it instead escapes, carried along by strong winds blowing out from the center of the quasar.


"These winds blow at thousands of miles per second, far faster than any winds we see on Earth," said Niel Brandt, a Distinguished Professor of Astronomy and Astrophysics at Penn State and Filiz Ak's doctoral adviser. "The winds are important because we know that they play an important role in regulating the quasar's central black hole, as well as star formation in the surrounding galaxy."


Many quasars show evidence of these winds in their spectra -- measurements of the amount of light that the quasar gives off at different wavelengths. Just outside the center of the quasar are clouds of hot gas flowing away from the central black hole. As light from deeper in the quasar passes through these clouds on its way to Earth, some of the light gets absorbed at particular wavelengths corresponding to the elements in the clouds.


As gas clouds are accelerated to high speeds by the quasar, the Doppler effect spreads the absorption over a broad range of wavelengths, leading to a wide valley visible in the spectrum. The width of this "broad absorption line" (BAL) measures the speed of the quasar's wind. Quasars whose spectra show such broad absorption lines are known as "BAL quasars."
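
The article gives no example numbers, but converting a blueshifted absorption feature into a wind speed is just the ordinary Doppler relation. Below is a sketch with illustrative wavelengths; the C IV resonance line near 1549 Å is a common BAL tracer, and the absorbed wavelength here is chosen only to land in the "thousands of miles per second" regime the article describes.

```python
C_KM_S = 299_792.458  # speed of light, km/s

def wind_speed_km_s(rest_wavelength, observed_wavelength):
    """Non-relativistic Doppler estimate of outflow speed from a blueshifted
    absorption trough (valid while v << c); wavelengths in the same units."""
    return C_KM_S * (rest_wavelength - observed_wavelength) / rest_wavelength

# Illustrative only: C IV line (rest ~1549 Angstroms) absorbed at 1510 Angstroms
# in the quasar's rest frame gives a wind speed of several thousand km/s.
v = wind_speed_km_s(1549.0, 1510.0)
print(f"{v:.0f} km/s")
```

The broad "valley" in a BAL spectrum corresponds to a range of such speeds, since gas at many velocities absorbs at many slightly different wavelengths.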


But the hearts of quasars are chaotic, messy places. Quasar winds blow at thousands of miles per second, and the disk around the central black hole is rotating at speeds that approach the speed of light. All this action adds up to an environment that can change quickly.


Previous studies had found a few examples of quasars whose broad absorption lines seemed to have disappeared between one observation and the next. But these quasars had been found one at a time, and largely by chance -- no one had ever done a systematic search for them until 1998, when the Sloan Digital Sky Survey (SDSS) undertook the challenge of regularly measuring the spectra of hundreds of quasars during an effort spanning several years.


Over the past three years, as part of SDSS-III's Baryon Oscillation Spectroscopic Survey (BOSS), the researchers specifically have been seeking out repeated spectra of BAL quasars through a program proposed by Brandt and his colleagues.


Their persistence paid off -- the research team gathered a sample of 582 BAL quasars, each of which had repeat observations over a period of between one and nine years -- a sample about 20 times larger than any that previously had been assembled. The team then began to search for changes, and found that, in 19 of the quasars, the broad absorption lines had disappeared.


There are several possible explanations for the disappearance of the gas clouds, but the simplest is that, in these quasars, gas clouds that previously had been detected are now "gone with the wind" -- blown out of the line-of-sight between us and the quasar by the rotation of the quasar's disk and its wind. Because the sample of quasars is so large, and had been gathered in such a systematic manner, the team is able to go beyond simply identifying disappearing gas clouds. "We can quantify this phenomenon," Ak said.


Finding 19 such quasars out of 582 total indicates that about three percent of quasars show disappearing gas clouds over a three-year span, which in turn suggests that a typical quasar cloud spends about a century along our line of sight. "It is fascinating to be able to document these relatively rapid changes that actually occurred billions of years ago, at a time before the Sun was formed," remarked team member Donald Schneider, Distinguished Professor of Astronomy and Astrophysics at Penn State and the SDSS-III Survey Coordinator.
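
The century figure follows from simple arithmetic on the survey numbers: if a few percent of sight-line clouds vanish per monitoring window, a typical cloud must survive roughly the window length divided by that fraction.

```python
n_disappeared, n_total = 19, 582   # quasars with vanished BALs / BAL quasars sampled
window_years = 3                   # typical monitoring baseline in the sample

fraction = n_disappeared / n_total
# If ~3% of clouds leave the line of sight per 3-year window, the mean time a
# cloud spends along our line of sight is about window / fraction.
lifetime_years = window_years / fraction
print(f"{fraction:.1%} per {window_years} yr -> ~{lifetime_years:.0f} yr per cloud")
```

This back-of-the-envelope estimate is what a fuller statistical treatment of the sample refines.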


Now, as other astronomers come up with models of quasar winds, their models will need to explain this 100-year timescale. As theorists begin to consider the results, and the team continues to analyze its sample of quasars, more results are expected soon. "This research is really exciting for me," Filiz Ak said. "I'm sitting at my desk, discovering the nature of the most powerful winds in the Universe."



Sunday, October 7, 2012

The Helix Nebula: Bigger in death than life

A dying star is refusing to go quietly into the night, as seen in a combined infrared and ultraviolet view from NASA's Spitzer Space Telescope and the Galaxy Evolution Explorer (GALEX), which NASA has lent to the California Institute of Technology in Pasadena. In death, the star's dusty outer layers are unraveling into space, glowing from the intense ultraviolet radiation being pumped out by the hot stellar core.

This object, called the Helix nebula, lies 650 light-years away in the constellation of Aquarius. Also known by the catalog number NGC 7293, it is a typical example of a class of objects called planetary nebulae. Discovered in the 18th century, these cosmic works of art were erroneously named for their resemblance to gas-giant planets.

Planetary nebulae are actually the remains of stars that once looked a lot like our sun. These stars spend most of their lives turning hydrogen into helium in sustained nuclear fusion reactions in their cores. In fact, this process of fusion provides all the light and heat that we get from our sun. Our sun will blossom into a planetary nebula when it dies in about five billion years.

When the hydrogen fuel for the fusion reaction runs out, the star turns to helium for a fuel source, burning it into an even heavier mix of carbon, nitrogen and oxygen. Eventually, the helium will also be exhausted, and the star dies, puffing off its outer gaseous layers and leaving behind the tiny, hot, dense core, called a white dwarf. The white dwarf is about the size of Earth, but has a mass very close to that of the original star; in fact, a teaspoon of a white dwarf would weigh as much as a few elephants!

The intense ultraviolet radiation from the white dwarf heats up the expelled layers of gas, which shine brightly in the infrared. GALEX has picked out the ultraviolet light pouring out of this system, shown throughout the nebula in blue, while Spitzer has snagged the detailed infrared signature of the dust and gas in red, yellow and green. Where red Spitzer and blue GALEX data combine in the middle, the nebula appears pink. A portion of the extended field beyond the nebula, which was not observed by Spitzer, is from NASA's all-sky Wide-field Infrared Survey Explorer (WISE). The white dwarf star itself is a tiny white pinprick right at the center of the nebula.


Ordered atoms in glass materials discovered

Scientists at the U.S. Department of Energy's (DOE) Ames Laboratory have discovered the underlying order in metallic glasses, which may hold the key to the ability to create new high-tech alloys with specific properties.


Glass materials may have a far less randomly arranged structure than formerly thought.


Over the years, ideas about how metallic glasses form have evolved: from simple random packing, to very small ordered clusters, to the realization that longer-range chemical and topological order exists.


But by studying the structure of a metallic glass alloy formed at varying cooling rates, Matthew Kramer and his team of fellow scientists at the Ames Laboratory have been able to show there is some organization to these structures. These findings were recently published in Scientific Reports, and in a second paper in Physical Review Letters with Paul Voyles' team from the University of Wisconsin-Madison.


"This has been one of those burning questions in material science for a while, how to describe these disordered systems. Our studies are showing this underlying structure. It's diffuse, but it's there. It's been suspected for a long time and even the general structures have been postulated, but to what degree and how to quantify them, that has been the trick," Kramer said.


Kramer's team of scientists used melt spinning to form the alloy samples, a technique that supercools a liquid by ejecting it in a stream onto a rapidly spinning copper wheel. The high cooling rates allow the liquid to solidify as a non-crystalline alloy, or metallic glass.


"It's not only the chemistry of that metal that's important. In many cases, how you get it to the solid state, how it solidifies is also a critical factor," said Kramer.


Data were gathered using a high-energy X-ray beam at the Advanced Photon Source at Argonne National Laboratory, atom-probe chemical analysis, and computational modeling.


The researchers found that local configurations of atoms tend to be more ordered than the structure as a whole. Kramer compared it to the design elements in a complex wallpaper pattern.


"You'll see a little individual design in that wallpaper, and it has a bit of intricacy. That smaller, complex design, you'll see it repeated throughout the wallpaper. In crystallography we call that a motif. A crystalline solid has those motifs in a very ordered array. In the liquid structure, these motifs are still present, but are shuffled around a bit. They're not marching in rows anymore."


Kramer said in liquids these motifs, while not well organized in repeating patterns like crystalline structures, do tend to fall into discrete distances from each other within a certain range.


Not only that, they begin to organize themselves into interconnected networks, similar to the polymeric chains seen in silicate glass and polymers.


"It's these interconnected networks and the degree to which they develop, which probably controls the ability to go from a liquid state to a glassy state with a metal," said Kramer.


Understanding exactly how these metallic glasses form is the key to being able to manipulate their structure for development of new alloys.


"Developing new materials has largely been an Edisonian process. People guess at some interesting alloy compositions, they do some sort of casting, and they look and see what they get. We're trying to get at the challenge in looking for new materials in a different way," Kramer explained. "What might the arrangement of atoms need to be in order to provide the properties we want? Can you actually in fact create these novel structures? By understanding these fundamental building blocks and arranging them in new ways, can we create materials with new or different properties? These are the questions we want to answer."


The research is supported by the U.S. Department of Energy Office of Science through the Ames Laboratory.



Saturday, October 6, 2012

A curious cold layer in the atmosphere of Venus

Venus Express has spied a surprisingly cold region high in the planet's atmosphere that may be frigid enough for carbon dioxide to freeze out as ice or snow.


The planet Venus is well known for its thick, carbon dioxide atmosphere and oven-hot surface, and as a result is often portrayed as Earth's inhospitable evil twin.


But in a new analysis based on five years of observations using ESA's Venus Express, scientists have uncovered a very chilly layer at temperatures of around -175°C in the atmosphere, 125 km above the planet's surface.


The curious cold layer is far frostier than any part of Earth's atmosphere, for example, despite Venus being much closer to the Sun.


The discovery was made by watching as light from the Sun filtered through the atmosphere to reveal the concentration of carbon dioxide gas molecules at various altitudes along the terminator -- the dividing line between the day and night sides of the planet.


Combining the carbon dioxide concentrations with data on atmospheric pressure at each height, scientists could then calculate the corresponding temperatures.
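
The underlying relation is the standard ideal-gas law, P = n·k_B·T, solved for temperature. The sketch below uses that textbook relation with purely illustrative numbers (not Venus Express data) chosen to land near the reported ~-175°C.

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_K(pressure_Pa, number_density_m3):
    """Ideal-gas temperature from total pressure and molecular number density:
    P = n * k_B * T  =>  T = P / (n * k_B)."""
    return pressure_Pa / (number_density_m3 * K_B)

# Illustrative values only: at ~125 km both the pressure and the gas density
# are tiny fractions of their surface values.
t = temperature_K(1.36e-4, 1.0e17)
print(f"{t:.0f} K ({t - 273.15:.0f} degrees C)")
```

Measuring two of the three quantities (here, density from the sunlight-filtering observations and pressure from atmospheric models) pins down the third.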


"Since the temperature at some heights dips below the freezing temperature of carbon dioxide, we suspect that carbon dioxide ice might form there," says Arnaud Mahieux of the Belgian Institute for Space Aeronomy and lead author of the paper reporting the results in the Journal of Geophysical Research.


Clouds of small carbon dioxide ice or snow particles should be very reflective, perhaps producing layers in the atmosphere that appear brighter than normal in reflected sunlight.


"However, although Venus Express indeed occasionally observes very bright regions in the Venusian atmosphere that could be explained by ice, they could also be caused by other atmospheric disturbances, so we need to be cautious," says Dr Mahieux.


The study also found that the cold layer at the terminator is sandwiched between two comparatively warmer layers.


"The temperature profiles on the hot dayside and cool night side at altitudes above 120 km are extremely different, so at the terminator we are in a regime of transition with effects coming from both sides.


"The night side may be playing a greater role at one given altitude and the dayside might be playing a larger role at other altitudes."


Similar temperature profiles along the terminator have been derived from other Venus Express datasets, including measurements taken during the transit of Venus earlier this year.


Models are able to predict the observed profiles, but further confirmation will be provided by examining the role played by other atmospheric species, such as carbon monoxide, nitrogen and oxygen, which are more dominant than carbon dioxide at high altitudes.


"The finding is very new and we still need to think about and understand what the implications will be," says Håkan Svedhem, ESA's Venus Express project scientist.


"But it is special, as we do not see a similar temperature profile along the terminator in the atmospheres of Earth or Mars, which have different chemical compositions and temperature conditions."



NASA's infrared observatory measures expansion of universe

Astronomers using NASA's Spitzer Space Telescope have announced the most precise measurement yet of the Hubble constant, or the rate at which our universe is stretching apart.


The Hubble constant is named after the astronomer Edwin P. Hubble, who astonished the world in the 1920s by confirming our universe has been expanding since it exploded into being 13.7 billion years ago. In the late 1990s, astronomers discovered the expansion is accelerating, or speeding up over time. Determining the expansion rate is critical for understanding the age and size of the universe.


Unlike NASA's Hubble Space Telescope, which views the cosmos in visible light, Spitzer took advantage of long-wavelength infrared light to make its new measurement. It improves by a factor of 3 on a similar, seminal study from the Hubble telescope and brings the uncertainty down to 3 percent, a giant leap in accuracy for cosmological measurements. The newly refined value for the Hubble constant is 74.3 plus or minus 2.1 kilometers per second per megaparsec. A megaparsec is roughly 3 million light-years.
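
To see what the refined constant means in practice, Hubble's law (v = H0 × d) gives the recession speed of a galaxy at a given distance. A quick sketch using the value from the article; the 100-megaparsec example distance is arbitrary.

```python
H0 = 74.3       # km/s per megaparsec (the Spitzer-refined value)
H0_ERR = 2.1    # quoted uncertainty, km/s per megaparsec

def recession_speed_km_s(distance_Mpc):
    """Hubble's law: expansion carries a galaxy away at v = H0 * d."""
    return H0 * distance_Mpc

# A galaxy 100 megaparsecs (~326 million light-years) away:
v = recession_speed_km_s(100)
print(f"{v:.0f} +/- {H0_ERR * 100:.0f} km/s")
```

The 3 percent uncertainty on H0 translates directly into a 3 percent uncertainty on any speed (or distance) inferred this way.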


"Spitzer is yet again doing science beyond what it was designed to do," said project scientist Michael Werner at NASA's Jet Propulsion Laboratory in Pasadena, Calif. Werner has worked on the mission since its early concept phase more than 30 years ago. "First, Spitzer surprised us with its pioneering ability to study exoplanet atmospheres," said Werner, "and now, in the mission's later years, it has become a valuable cosmology tool."


In addition, the findings were combined with published data from NASA's Wilkinson Microwave Anisotropy Probe to obtain an independent measurement of dark energy, one of the greatest mysteries of our cosmos. Dark energy is thought to be winning a battle against gravity, pulling the fabric of the universe apart. Research based on this acceleration garnered researchers the 2011 Nobel Prize in physics.


"This is a huge puzzle," said the lead author of the new study, Wendy Freedman of the Observatories of the Carnegie Institution for Science in Pasadena. "It's exciting that we were able to use Spitzer to tackle fundamental problems in cosmology: the precise rate at which the universe is expanding at the current time, as well as measuring the amount of dark energy in the universe from another angle." Freedman led the groundbreaking Hubble Space Telescope study that earlier had measured the Hubble constant.


Glenn Wahlgren, Spitzer program scientist at NASA Headquarters in Washington, said infrared vision, which sees through dust to provide better views of variable stars called cepheids, enabled Spitzer to improve on past measurements of the Hubble constant.


"These pulsating stars are vital rungs in what astronomers call the cosmic distance ladder: a set of objects with known distances that, when combined with the speeds at which the objects are moving away from us, reveal the expansion rate of the universe," said Wahlgren.


Cepheids are crucial to the calculations because their distances from Earth can be measured readily. In 1908, Henrietta Leavitt discovered these stars pulse at a rate directly related to their intrinsic brightness.


To visualize why this is important, imagine someone walking away from you while carrying a candle. The farther the candle traveled, the more it would dim. Its apparent brightness would reveal the distance. The same principle applies to cepheids, standard candles in our cosmos. By measuring how bright they appear on the sky, and comparing this to their known brightness as if they were close up, astronomers can calculate their distance from Earth.
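
The candle analogy corresponds to the standard distance-modulus relation, m - M = 5·log10(d/10 pc), the logarithmic form of inverse-square dimming. Here is a sketch with a hypothetical cepheid; the magnitudes are illustrative, not Spitzer measurements.

```python
# Distance from the distance modulus: m - M = 5 * log10(d / 10 pc)
def distance_parsecs(apparent_mag, absolute_mag):
    """Distance to a standard candle from how much dimmer it appears (m)
    than it intrinsically is (M)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Hypothetical cepheid: intrinsic brightness M = -4.0 (inferred from its
# pulsation period via the Leavitt relation), observed at m = 14.5.
d = distance_parsecs(14.5, -4.0)
print(f"{d:.0f} pc (~{d * 3.26:.0f} light-years)")
```

Note that brightness enters logarithmically: every 5 magnitudes of dimming corresponds to a factor of 10 in distance.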


Spitzer observed 10 cepheids in our own Milky Way galaxy and 80 in a neighboring galaxy called the Large Magellanic Cloud. Without cosmic dust blocking their view, the Spitzer research team was able to obtain more precise measurements of the stars' apparent brightness, and thus their distances. These data opened the way for a new and improved estimate of our universe's expansion rate.


"Just over a decade ago, using the words 'precision' and 'cosmology' in the same sentence was not possible, and the size and age of the universe was not known to better than a factor of two," said Freedman. "Now we are talking about accuracies of a few percent. It is quite extraordinary."


The study appears in the Astrophysical Journal. Freedman's co-authors are Barry Madore, Victoria Scowcroft, Chris Burns, Andy Monson, S. Eric Persson and Mark Seibert of the Observatories of the Carnegie Institution and Jane Rigby of NASA's Goddard Space Flight Center in Greenbelt, Md.


NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA. For more information about Spitzer, visit http://spitzer.caltech.edu and http://www.nasa.gov/spitzer .

