Archive for January, 2011
After spending more than a decade reporting on the Centers for Disease Control and Prevention, Maryn McKenna knows plenty of ways we could all die terrible deaths, compliments of nature’s craftiest single-celled organisms. Her coverage of anthrax, polio, bird flu and MRSA eventually earned her the nickname “Scary Disease Girl.”
McKenna channeled that experience into two books, the most recent of which is Superbug: The Fatal Menace of MRSA. It chronicles how methicillin-resistant Staphylococcus aureus emerged in the modern world and became an epidemic. She also writes Wired Science’s Superbug blog.
As a journalist who specializes in the terrifying, McKenna said she’s always careful to balance the appalling details with the empowering facts to educate people about their risks and how to protect themselves.
“People like to be scared, in sort of the same way they like to go to horror movies,” McKenna told me when I caught up with her at ScienceOnline 2011 in Durham, N.C. “On the one hand, I can rely on there being a consistent audience for tales of diseases that sneak up on us and things that make your face melt, things that make you melt from the inside. On the other hand, I have the responsibility as a journalist not to make people so frightened that they will be paralyzed or they will not take steps in their own defense or mischaracterize their own risk.”
Watch an edited version of my interview with her above.
Give each geek a blog and you’ll get a taste of the many flavors science can take on.
Some will be scholarly, crusading or probing, others whimsical or funny, but each flavor will reveal something about how its creator ticks. As Robert Krulwich, NPR’s science correspondent and keynote speaker at ScienceOnline 2011, said in an interview: “You can’t help yourself. You ask the question that your soul asks.”
Unlike the more than 200 registered bloggers at ScienceOnline 2011 who mingled Jan. 13 to Jan. 16 in Research Triangle Park, Krulwich doesn’t blog. But his Radio Lab podcasts and Nova videos represented one flavor. Darlene Cavalier, Mary Canady and Brian Malow provided distinctly different flavors. All four talked to Science in the Triangle about their approach. (Watch Krulwich’s interview here.)
Cavalier is a former Disney Publishing executive who outed herself as a former Philadelphia 76ers cheerleader to advocate for science literacy. She started ScienceCheerleader.com and helps match people without a hard science background with scientists who need help with research, such as tracking birds’ migratory patterns, taking water samples or measuring snowfall.
Watch Cavalier talk about her citizen scientist flavor:
Canady is a biochemistry Ph.D. who switched from bench science to marketing. In 2008, she started Comprendia.com, a virtual bioscience consulting group in San Diego, and began thinking about whether marketing and science blogging can mix.
“We’re forging new trails here and need to be creative in thinking about these new relationships – think outside the box, as trite as it may sound,” she said during a ScienceOnline 2011 session.
The wall between advertising and content is best handled with care, as last year’s Pepsigate at ScienceBlogs.com showed. More than 20 contributors pulled out after the first-of-its-kind science blogging network announced it would publish posts by Pepsi scientists.
But what about scientists posting on corporate blogs, companies sponsoring ask-an-expert forums and businesses underwriting independent blogs?
Here is Canady’s take on the business flavor:
Malow is a professional stand-up comedian with a liberal arts degree who is feeling his way into science comedy.
A voracious reader who is intrigued by astronomy, physics and evolution, he started adding jokes about particles, Star Wars and creationism to his repertoire a few years ago.
He said he wasn’t hired to perform at ScienceOnline 2011 but volunteered to go on stage and pulled together an entire show in just a few hours.
Watch an uncut interview with Malow about his taste of fun:
The sci-fi action-shooter — a collaboration between Epic, People Can Fly and Electronic Arts — follows space pirate Grayson Hunt as he blasts his way through a paradise planet filled with destruction. Central to the gameplay are bonus points for “killing with skill” — finding creative ways to dispatch your enemies using combinations of outlandish weapons, the deadly environment and an energy whip.
Did I mention it was rated “M” for mature?
The demo is only about seven minutes long (six if you make the target time), but it’s packed with enough bullets and body parts to satiate most fans of first-person shooters.
I think the full game will too. At least for a little while. Read more…
Anything that sucks our blood while we lie in bed sleeping is bound to stir strong feelings. Think of vampires and the many movies they have inspired, even though vampires are, at best, folklore.
Bed bugs are real. They’re nocturnal but will come out during the day if they’re really hungry. They cannot live without human blood. They’re small but still visible. And as six-legged creepy crawlies, their ick factor outranks that of any of the 170 movie versions of Count Dracula.
They’re also on the rebound.
A century ago, “Sleep tight, don’t let the bed bugs bite,” was not a children’s book. It was something parents said when they tucked in their children at night, and they meant it. Then the insects stopped being a pest in the U.S.
In the early 1990s, they were back in hotels, motels and private homes. Two decades later, the insects are becoming a nightmare in low-income housing, nursing homes and apartment buildings, said Coby Schal, an entomologist at N.C. State University who is a bed bug and cockroach expert.
“But it’s just the beginning of the problem,” Schal said Tuesday during a pizza lunch talk he gave at Sigma Xi in Research Triangle Park. Read more…
Monday, January 24
Brad Ball will discuss both the strategy and execution of loyalty marketing today. Loyalty Marketing is an approach to marketing in which an organization focuses on growing and retaining existing customers through incentive programs and targeted communications.
RSVP at http://marketingmondays.eventbrite.com/. Admission is free. Please visit us online at www.marketingmondays.org (our LinkedIn Group) or send an email to email@example.com.
Tuesday, January 25
Building your SEO Strategy
Register and find more details here.
Wednesday, January 26
A Green Future for Economic Development: The Dollars and Sense of Open Space
8AM – 3PM
McKimmon Center, NCSU, Raleigh
Open space preservation promotes vibrant economic development and attracts the talented workforce the region needs, while saving money and improving the quality of place and the health of the region’s citizens.
Learn more here.
Do you have an idea that could turn into the next Facebook, Sham Wow or Snuggie? Do you want expert help shaping your idea and finding out how viable it might be? Come join us for our next RTP Idea Lab meeting on January 26th at 8:00 AM at RTP Headquarters at 12 Davis Drive. Space is limited to the first 100, so go online to register and submit ideas at www.rtpidealab.org to request participation.
Thursday, January 27
Global Innovation Series
11:30AM – 2PM
This series of events is focused on identifying the state’s most innovative companies, individuals and groups and promoting them globally. These co-creation and collaboration luncheons bring together the big thinkers and doers in our region to share their perspectives and ideas. Our speakers for this roundtable discussion include Frank Plastina, President and CEO of Tekelec; Dr. John Hardin, Executive Director of the North Carolina Board of Science and Technology; and Timm Crowder, Director of Innovation CoE, GSK.
Members: $25. Non-Members: $45
More details here.
For a complete listing of professional, networking, and tech-based events in the Research Triangle Region, please visit the Science in the Triangle events calendar.
Painters develop a style – Van Gogh’s brush strokes, Pollock’s abstract drips, Mondrian’s intersecting black lines. Writers find a voice to express themselves. Robert Krulwich, NPR’s science correspondent, has both: his signature storytelling paints with sounds.
As the keynote speaker at ScienceOnline 2011, which from Jan. 13 through Jan. 16 brought together more than 300 science bloggers in North Carolina’s Research Triangle Park, Krulwich used some of his science videos and radio podcasts, including a Radio Lab recording from 2008, as examples.
The Radio Lab recording explored brain research into how we make choices and, among other people, featured Dr. Oliver Sacks, a neurologist at Columbia University Medical Center in New York and author of the book “Awakenings.”
In the recording, Sacks talked about the routines he has developed to minimize choices. One involved a weekly trip to the farmer’s market, where he would always buy two pounds of kidneys. One week something went wrong and the vendor misunderstood Sacks. Instead of two pounds, the vendor packed up 22 pounds of kidneys. Too shy to complain, Sacks said, he just took them, paid for them and carried them home.
“I should have thrown away this monstrous, palpitating bag of kidneys,” Sacks said.
“Then followed an increasingly nightmarish period in which I had kidneys for breakfast, for lunch. Kidneys stewed. Sweet kidneys,” he said. “Finally, after about 10 days, by which time I had eaten about 50, BLUEERRGHH, an incredible nausea and vomiting took hold of me.”
Sacks’ retching sound unequivocally answered the question of how much is too much in a way any kindergartener could understand.
Krulwich said in his keynote talk that he aims to take the audience along as he discovers how things work. The sounds are there to drive home impressions along the journey.
His sound pictures work, telling a story you can understand and feel, because Krulwich is inquisitive and an explainer at heart.
“You can’t help yourself,” he said in an interview with Science in the Triangle. “You ask the question that your soul asks.”
This urge to explain things has gotten him into trouble, as Krulwich acknowledged in the interview. He has been told that it will never make him famous. And it’s hard work. It may look effortless when he breaks down complex topics such as science, technology and economics in a way that his aunt Nancy, who got a B- in biology, can understand. But it isn’t, he said.
Watch the interview with Krulwich:
Watch the science story that he said was the hardest to tell here.
What Brian Hare says might rub people who quibble about evolution the wrong way.
Hare, an assistant professor of evolutionary anthropology at the Duke Institute for Brain Sciences, says humans are apes.
Indeed, on the family tree of the great apes, we sit between chimpanzees and bonobos on one side and gorillas and orangutans on the other.
“Humans are slap dab in the middle of the great ape clade,” Hare said during a talk he gave Friday at N.C. State University’s biology department.
But wait a minute. We may share 98.7 percent of our genetic material with apes, but we’ve accomplished a lot more than they have. We speak and write books. We pray. We build cities and pay with money that’s part of a global financial system. We join different groups. We depose dictators. Apes live in trees. They grunt and scream. Their allegiances tend to be with one group only and they usually follow a strict ranking system.
To figure out how we humans got to be that way, researchers have begun to set up experiments with chimpanzees and bonobos, the apes most closely related to us. Hare’s research is based on these experiments. At Duke, for example, he has access to two sanctuaries, the Tchimpounga Natural Reserve in the Republic of Congo and Lola ya Bonobo in the neighboring Democratic Republic of Congo, a country formerly known as Zaire. Read more…
The lights in the conference room before him are dimmed, but Dan Amerson is still scanning faces in the crowd as he paces excitedly, silhouetted by the glow of the projector screen behind him. He’s explaining to the audience, matter-of-factly, the critical elements missing from the motion gaming industry today.
For more than three decades, video games offered players an effective method of digitizing their actions and translating them to on-screen motion. With dials, buttons and joysticks, gamers could manipulate their virtual worlds without much effort. It was tactile. Simple. And particularly with next-generation consoles, it granted the ability to make and break contact with objects in the game with a twist, mash or thrust.
But what took off with the Nintendo Wii in 2006 and continued this holiday season with the Xbox Kinect and Playstation Move was a desire for a more active form of interaction — motion.
That’s both a problem and an opportunity for programmers like Amerson, vice president of engineering for middleware developer Activate3D. As it turns out, computers are downright terrible at figuring out what you’re trying to do when you don’t have buttons.
But Amerson’s plan is to equip games to recognize that subtlety, using what his company calls intention recognition and synthesis.
“A lot of motion games out there can take your motion and they can put it on-screen, but what they can’t do is let you grab onto that object in the world and let you do something meaningful,” Amerson told the crowd at RTP Headquarters in Durham, N.C., Dec. 8.
If his company is successful in bringing its technology to market, Amerson believes it will change the way people engage virtual environments.
“The Playstation 3 and the Xbox 360 have not changed since they came out, yet everyone wants to have a bigger, better, badder game. So how do we do that? Well, we have to write better code, we have to make our artists smarter, give them better tools, come up with new tricks,” Amerson said.
HISTORY OF FAILURE
Motion-controlled games aren’t particularly new. Long before the Wii had consumers lining up outside stores in the cold, Mattel’s Power Glove for the Nintendo Entertainment System tracked coarse hand gestures in 1989. Despite grossing $88 million, it underwhelmed consumers.
Sega released its Activator peripheral in 1993, telling players in an elaborate four-minute instructional video how they were “pioneers on the interactive frontier.” The video also warned against placing the octagonal device, which worked when users broke an infrared beam, under overhead light sources or “metallic or mirrored ceilings.” It never caught on.
But as gaming systems became more powerful, peripheral manufacturers started getting the formula right. As a precursor to the more modern movement-based controllers, the EyeToy, a Logitech-built camera released for the Sony PlayStation 2 before the holidays in 2003, attempted to capture motion by placing the player’s image on-screen. It sold 400,000 units in North America alone by the end of the year.
“People seem to have forgotten that there were games controlled solely by camera back on the PlayStation 2,” Amerson said. “As the technology moves forward, we’re going to get increasingly more accurate, better fidelity, interesting new combinations of the technology. We’ve now got the ability to use not just coarse motion, but actually some very precise motion.”
And that’s where devices like the Move and Kinect can succeed where others have failed, according to Michael Young, an associate professor of computer science at N.C. State who taught Amerson as an undergraduate (full disclosure: I’m employed by N.C. State as a journalism adviser).
“The real principal challenge is to correctly map a player’s intent to how they play the game,” said Young, who teaches courses in video game design. “The greater the connection between the choices of the player and the feedback of the game, the greater the acceptance of the choices you have.”
But to cement that connection, Amerson says the Kinect and Move will need a little help from his company’s technology.
“Taking input data, taking someone cavorting in front of a camera and putting it on-screen is of limited interest. You’ll go do it sometime in your life and it’ll be fun. You’ll have a good time,” he said. “But 15 minutes later, you’ll realize that’s all there is to it.”
REDUCING THE NOISE
Amerson boots up a small camera at the front of the dark Durham conference room, and a miniaturized image of him pops up in the corner of the screen behind him. Looming larger on the screen over the 6-foot-2-inch programmer’s shoulder is a teenaged avatar sporting a long-sleeved shirt and blue jeans, looking out over a vibrant playground.
The kid on-screen mimics Amerson’s motions until he approaches a set of virtual monkey bars. Miming a leap without ever leaving the ground, Amerson closes his hands as his on-screen persona grasps the bars and hangs free, ignoring the real Amerson’s legs rooted firmly on the conference room floor.
Actions like these aren’t easy for a program to understand, especially given the limited data from one camera.
Take grabbing things, for instance. The image of an opening and closing human hand can appear radically different depending on how it’s positioned. And absent a controller, that image is all the program has to go on.
So Activate3D’s Intelligent Character Motion software helps it make an educated guess. By processing dozens of images of open and closed hands, the system builds a mathematical model. The team then pipes in live video of their hands while the system guesses if they’re open, and humans make corrections along the way.
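The loop described above — label a batch of images, fit a model, then let humans correct its live guesses — is classic supervised learning with a human in the loop. Here is a minimal sketch of that idea; the feature vectors, class labels and perceptron update rule are my own illustration, not Activate3D's actual ICM code, which is not public:

```python
# Toy human-in-the-loop classifier for "is the hand open?" -- illustrative only.
# A real system would extract features from camera frames; here each "image"
# is a hand-made feature vector: [silhouette_area, finger_spread, bias].

def predict(weights, features):
    """Linear score -> boolean guess: is the hand open?"""
    score = sum(w * f for w, f in zip(weights, features))
    return score > 0

def train(samples, epochs=20, lr=0.1):
    """Perceptron fit over (features, is_open) pairs.
    Each wrong guess triggers a correction, like the humans in the loop."""
    weights = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for features, is_open in samples:
            if predict(weights, features) != is_open:
                sign = 1.0 if is_open else -1.0
                weights = [w + lr * sign * f
                           for w, f in zip(weights, features)]
    return weights

# Labeled "images" of hands
labeled = [
    ([0.9, 0.8, 1.0], True),   # open hand: large silhouette, wide spread
    ([0.8, 0.9, 1.0], True),
    ([0.3, 0.1, 1.0], False),  # closed fist: small silhouette, no spread
    ([0.2, 0.2, 1.0], False),
]
w = train(labeled)
print(predict(w, [0.85, 0.7, 1.0]))  # classify an unseen, open-ish hand
```

The design point is the correction step: the model only updates when its guess disagrees with the label, which is exactly the role the team plays when it pipes in live video and fixes the system's mistakes.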
The software also recognizes what the player intends to do — jumping for example — without literal action. This could help games overcome an obvious constraint: there are only so many fun things you can do from inside your house.
“If I’m in front of a camera in my living room and I start walking, I quickly run into a physical limitation of gameplay when I knock over the camera or run smack into my TV,” Amerson said.
Translating real-world motion into virtual action also runs the risk of falling into the “uncanny valley,” where unnatural movement of almost-lifelike 3D animation actually grosses us out. ICM avoids that gut reaction by augmenting the player’s motion and removing irrelevant input — like the position of Amerson’s legs when his avatar is hanging in mid-air. By filtering that signal, the program allows virtual gravity to take effect, letting the legs swing and the shoulders rotate naturally.
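The filtering described here amounts to masking out joints that are irrelevant to the avatar's current state and substituting simulated values. A sketch of that idea follows; the joint names, state labels and swing formula are assumptions made for illustration, not the actual ICM pipeline:

```python
# Illustrative input filter: when the avatar is hanging from the bars, the
# player's real leg positions (planted on the floor) are irrelevant noise,
# so replace them with simulated "virtual gravity" values while keeping the
# tracked arm data.

import math

def filter_pose(tracked, avatar_state, t):
    """Return the pose the renderer should use.

    tracked: dict of joint name -> (x, y) position from the camera
    avatar_state: e.g. "standing" or "hanging"
    t: time in seconds, drives the simulated leg swing
    """
    pose = dict(tracked)  # start from the player's real motion
    if avatar_state == "hanging":
        # Discard the legs-on-the-floor signal; let simulated gravity
        # swing them instead of copying the player's planted stance.
        swing = 0.2 * math.sin(2.0 * t)
        pose["left_leg"] = (swing, -1.0)
        pose["right_leg"] = (swing, -1.0)
    return pose

tracked = {"left_arm": (0.1, 0.9), "right_arm": (-0.1, 0.9),
           "left_leg": (0.1, 0.0), "right_leg": (-0.1, 0.0)}
standing = filter_pose(tracked, "standing", t=0.0)
hanging = filter_pose(tracked, "hanging", t=0.0)
print(standing["left_leg"], hanging["left_leg"])
```

In the standing state the legs pass through untouched; in the hanging state the camera's leg data is overwritten, which is the "filtering that signal" step that keeps the avatar out of the uncanny valley.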
“I can break the rules of the virtual world very easily,” Amerson said. “We want to take all this into account and augment that — make you look like you’re doing what’s happening there — and then blend all that together, make it fit the environment, fit the physics and make it believable to you.”
Young said that by staying away from the uncanny valley, players will get a more immersive experience when they step into the “magic circle” of a video game.
“The relationship between the body and avatar, by default, is one to one,” Young said. “When it doesn’t happen, it pulls us out of the game.”
Although Amerson said there will always be great games that map motion literally, augmentation opens new possibilities for moving gaming forward.
“Give me a Kung Fu game. I can mimic those motions, I could pretend that I’m Jackie Chan,” he said. “But wouldn’t it be really awesome if, in my living room, I can pretend to be Jackie Chan and on TV see my avatar move with the grace and the fluidity and the expertise of Jackie Chan or Jet Li?”
And he said helping players inhabit actions that aren’t their own is what motion gaming needs to move from amusing to memorable.
“The best games give you 80 percent of the experience with 10 percent of the effort,” he said. “I think ultimately, that’s what game designers are trying to do — giving you as big and as bold an experience as possible with a low barrier of entry so it stays fun.”
Tuesday, January 18
RTP Headquarters, 12 Davis Drive
Topic: ScienceOnline 2011 – How has the World Wide Web changed how science is communicated online, taught and done?
Speakers: Panel from Science in the Triangle
RTP Headquarters, 12 Davis Drive
These events offer executives and entrepreneurs of North Carolina-based nanobiotechnology companies and others involved in this exciting field an opportunity to discuss key business issues and build relationships. Roundtable events will be hosted several times a year around the state and will feature presentations by CEOs and thought leaders on topics relevant to the business needs and opportunities for nanobio companies. The series provides a valuable networking opportunity for those involved in advancing nanobiotechnology innovations to market.
Wednesday, January 19
The Innovation Forum
First Flight Venture Center, 2 Davis Drive, RTP
$20 per person
Cash or check at the meeting
Click Here to RSVP to this event
Speaker: Richard West, CEO of Advanced Liquid Logic
The Innovalyst Innovation Forum is an exciting event that catalyzes radical thinking. Through its structured networking exercise and presentations, the Innovation Forum provides a purposeful approach towards learning more about life science innovation and about fellow innovators.
Thursday, January 20
The job fair is by invitation only, which means that in order to attend, you need to apply and be selected by one of the employers that are actively seeking to fill an open position. The jobs that are available are listed on our website: www.ibiliti-nc.com/connecting. Most of them are technical positions for scientists, engineers, regulatory, quality and sales professionals. There are more than 50 jobs listed there, and the list grows daily, so please check back often. There are open positions in Raleigh, Winston-Salem, Charlotte and throughout the state. If you are looking for a new medtech job in North Carolina, we urge you to apply.
Friday, January 21
Women Executives Roundtable
Brio Tuscan Grille, 4325 Glenwood Avenue, Raleigh
For a complete listing of professional and business related events in the Research Triangle Region, please visit the Science in the Triangle events calendar.