Archive for October, 2012
Post from Jean Thilmany:
My friend’s nine-year-old son, Jack, is the only boy enrolled in his school’s sewing class. Recently, he brought home his first creation, a pillowcase, and his mother reports he was excited to sleep on it that night. Next, he’s turned his attention to pajama pants.
What does this have to do with engineering? Well, everything.
“More boys would take the class if they just realized it was engineering,” Jack told his mother. In fact, many things in this world have civil or electrical or environmental or biomechanical or other types of engineering at their base—including fashion design.
Given the underrepresentation of women in engineering, and the ongoing discussion of gender roles and gender stereotyping at a young age, I plan to go out of my way to point out to the kids in my life the engineering underpinnings of everyday things: the clothes we wear, the bridge we cross over the Mississippi, the dams we pass as we travel along the river, and the automotive engine that propels the car we share.
I’m sure I’ll think of a million more examples, including a look at the way that nature engineers all plants to allow them to thrive best (for more on this, see the Computing section in the November 2012 issue of Mechanical Engineering magazine).
I’d like to encourage you to have the same conversations with the children and adults in your life that I’ve had with my six-year-old son. By making them more aware of engineers’ efforts and their effects on our everyday life we can boost engineers’ presence in society and increase the young talent within their ranks.
Post from Harry Hutchinson:
I always tell people how cool my job is. I get to talk to all kinds of interesting people—researchers, inventors, regulators, rule-makers, and rebels. I see some very clever stuff and sometimes play with it.
Ahmed Noor is a frequent contributor to Mechanical Engineering magazine. His most recent article, “Intelligent and Connected” in the November issue, is a forward-looking discussion of smart transportation systems.
He also heads a lab, the Center for Advanced Engineering Environments, at Old Dominion University in Hampton, Va. When I was invited to an open house there, I was eager to go.
The lab works with commercial partners to further the development of computers as engineering tools, with a particular emphasis on communication and interfaces.
When I showed up, everyone was standing around talking to a telepresence robot. Michael Clark, from the Institute for Software Research at Carnegie Mellon, had brought it along. He has four of the things, which go by the brand name Anybots, and he studies ways to use them in education.
The robot fits into a box about the size of a nightstand. It rolls around on two wheels like a Segway. It has an adjustable pole for a neck and a head with two eyes that are cameras. One of the eyes contains a laser pointer. The robot’s forehead has a small screen where you can see the operator.
In this case it was Scott Friedman, an M.D. controlling the robot from his home near Pittsburgh, 350 miles away. Through the robot, Friedman could follow us from room to room, see what was going on, and make his presence known.
Another demonstration at the open house was a walk-through of a highly detailed plant simulation developed by Eon Reality. It simulated a petroleum site in Angola.
Mats Johansson, Eon’s president, said it was developed to train technicians. You can open a schematic of the plant, click on the site you want, and the simulation will show you how to get there and what you will see. Then you can walk someone through the steps of what to do. The job checklist shows up in a window on the screen.
Eon also has a prototype of augmented-reality spectacles that work with a smartphone. You can call up an engineering drawing or a CAD model, for example, and see it projected on the lenses while you’re working on the equipment.
There was also a presentation on another emerging technology. Noor’s lab is working with a company called Emotiv on a computer interface that reacts to brain waves.
A student from the lab, Hari Phaneendra Kalyan, showed how the system works. He was able to open Internet Explorer and make Google.com appear on the command line, just by thinking it.
Then I got to try my head at it.
The controller is an EEG headset with 14 contact points. A schematic on the computer screen shows them black (no contact), yellow (getting warm), or green (go).
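The contact display can be thought of as a simple mapping from each electrode's signal quality to a status color. Here is a minimal sketch of that idea; the threshold values and function names are illustrative assumptions on my part, not Emotiv's actual software:

```python
# Illustrative sketch: map per-electrode signal quality (0.0-1.0) to the
# status colors shown on the lab's schematic. The 0.3 and 0.7 thresholds
# are assumptions for illustration, not Emotiv's real values.

def contact_status(quality):
    """Return 'black' (no contact), 'yellow' (getting warm), or 'green' (go)."""
    if quality < 0.3:
        return "black"
    if quality < 0.7:
        return "yellow"
    return "green"

def count_green(qualities):
    """Count how many of the headset's electrodes read green."""
    return sum(1 for q in qualities if contact_status(q) == "green")
```

With 14 quality readings, `count_green` would report a result like the 11-of-14 described below.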
Kalyan and a fellow student, Ben Cawrse, helped me get ready. We discovered it’s a challenge to make green contact with all the electrodes if you wear a pony tail.
After a bit of trial and error, they worked some electrodes under my hair and put others on my forehead and behind my ears. That gave us 11 green lights out of a possible 14.
Given that many greens, I thought hard to move the cursor, but nothing happened.
The graph of brain activity came on screen. The blue line, which reads frustration level, was very high, so I knew the system was working.
I said I was stymied in trying to move the cursor, so Cawrse moved to another window labeled “mouse” and clicked an icon. Suddenly, wherever I looked on the screen the cursor went with me. The unexpected ease made the experience downright eerie.
Cawrse switched to a page with a list of commands. He selected “push.”
An image on the page showed a box floating in the air. Kalyan told me to think hard about pushing it. So I did, gritting my teeth, even leaning into it.
Nothing happened. I wondered where my blue line was—probably pretty high.
Then the screen changed a little, but not because of anything I did or thought. As usual, I was about a page behind: nothing was supposed to happen on the screen during that exercise. We were teaching the system to recognize my “push” brainwaves.
It learned well. When I thought “push” at the right time, that virtual box started to slide into cyberspace.
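The calibration step described above, recording the user's brain activity while thinking “push” and then matching new readings against it, can be sketched as a nearest-centroid classifier. This is a deliberate simplification for illustration; it is not Emotiv's actual algorithm, and the feature vectors here are stand-ins for real EEG data:

```python
# Illustrative sketch of mental-command calibration: average the feature
# vectors recorded for each mental state ("push" vs. "neutral"), then
# classify a new window of brain activity by its nearest class centroid.
# A simplification for illustration only, not Emotiv's real algorithm.
import math

def centroid(samples):
    """Average a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

def train(labeled):
    """labeled: dict mapping a mental state to its recorded feature vectors."""
    return {label: centroid(vectors) for label, vectors in labeled.items()}

def classify(model, features):
    """Return the mental state whose centroid is closest to the new window."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda label: dist(model[label], features))
```

During the exercise I sat through, the system would have been collecting the “push” training vectors; afterward, each new reading classified as “push” is what slides the virtual box.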
There is a team at the lab working with Emotiv. They include Kalyan and Cawrse, who are computer science students, and Ajay Gupta of the computer science faculty.
Ahmed Noor estimates that this technology right now is about where voice-recognition was 20 years ago. He told me the lab is moving towards more advanced applications.
You’re not going to create much by pulling or pushing images on a screen. But if you can do that much today just by thinking it, where is this technology going to be in 20 years? Will people now locked in silence by severe physical disabilities be able to share some of their genius with us?
Post from Jean Thilmany:
Last summer, South African runner Oscar Pistorius became the first double-amputee runner to race in the Olympic games. In November, Mechanical Engineering magazine’s Input/Output section will feature C.J. Howard, a Californian who climbs mountains with the help of a specially designed climbing prosthetic that fits his lower leg.
This is additional proof that prosthetic limb design continues to push the bounds of technological prowess with new materials, robotics, and manufacturing methods.
As amazing as today’s advances are, it is startling to realize how long humans have relied on prosthetic limbs. Recently, Jacky Finch, a biomedical Egyptology researcher at the University of Manchester in England, found that artificial toes discovered on the foot of an Egyptian mummy are likely to be the world’s first prosthetic body parts.
The three-part wood and leather toe, dating to before 600 B.C., was found on a female mummy buried near Luxor, in Egypt. It could have been used as a practical tool to help its owner walk, Finch said.
All of which has put me in mind to re-read Poster Child, a wonderful memoir by Emily Rapp about the prosthetic leg she has worn since an operation at the age of four. Rapp detailed the pain (both physical and psychological) of wearing the prosthetic and the need for it to be constantly re-fitted and changed as technology advanced. No matter the technology, the leg often chafed and left wounds at the site where it attached.
That book was an eye opener for me. Behind the long human history of prosthetic limbs, and behind the marvels the field continues to produce today, prosthetics are still worn by people who struggle with them and with their appearance. And that, likely, is not about to change.
My October column in Mechanical Engineering magazine.
I’m writing this month’s column deep past midnight on the eve of what those of us in the publishing world have come to know dreadfully as “drop dead,” or the very last moment when our Midwest printer will accept final page proofs in order for the October issue to be printed, bound, and then shipped to you on time. Some of our key editors will complain that this isn’t the first time I’ve pulled this little stunt. But even as I write this, those same editors and I have yet to finalize the headline that will appear alongside this month’s captivating cover image when you receive the magazine.
Drop dead, as an editorial concept, is not unique to the traditional ink-on-paper printing process; it’s ubiquitous, common to editorial content disseminated in every format: on a website, in a digital edition for a tablet, or chiseled somewhere in the blogosphere by an indiscriminate scribe.
The reason our cover is giving us headline trouble is that there’s more than an undercurrent of discord among us on how to succinctly and elegantly describe a new form of 3-D printers that are helping to reshape the way designers think about their craft. Hod Lipson, who teaches at Cornell University and is the author of the article, surmises that “years of observing mass-produced objects made subject to traditional manufacturing constraints” may be at play in why some designers occasionally come up empty when given the opportunity to show their design mettle.
In a not-so-subtle nudge at CAD systems, Lipson quips that design creativity may be stunted by the thinking imposed by using conventional software. Conceptually, he says, “CAD software remains today a 3-D drawing board that records our intentions but offers little insight or ideas of its own and offers limited access to the vast new space of geometric complexity.”
Making his case for a new generation of three-dimensional printers that empower designers with the control they never had before over the shape and composition of matter, Lipson argues that new tools democratize design and enable the growth of new types of designers, some of whom may not even have formal engineering training.
Lipson is not suggesting the demise or even a diminished role for the engineer in the design process, per se. What he is offering, however, is the suggestion that certainty is transient. And that being flexible and able to adapt to changing technologies is a key to increasing the probability of one’s long-term success.
In keeping with that sentiment, we here also have adapted and thus adopted distribution methods for the magazine that take advantage of technologies that aren’t reliant on traditional models. For example, you can now access a digital version of the magazine on asme.org and through your mobile or tablet devices.
Not long ago, magazines only came printed on paper, and the engineering designs that came from printers were blueprints. Though change brings uncertainty, in the end, it often helps us be better at the things that we do. And this is progress.