Manufacturing Blood

•August 21, 2008 • Leave a Comment

Gone may be the days when hospitals have to struggle to maintain a sufficient stock of blood in the blood bank, run blood donation camps, and lose patients simply because they could not get the right blood at the right time. After the discovery of the enzyme that cleaves antigens, we may now have an even better option: producing a limitless amount of blood through stem cell technology. Truly mind-boggling progress in medical science!

Scientists have created red blood cells from human embryonic stem cells, in a step that they say could mean an infinite source of blood for transfusions.

According to the American Red Cross, 15 million units of blood are donated in the U.S. each year. Fourteen million units are transfused into Americans every year. The World Health Organization notes that people still die—especially in developing countries—because of an inadequate blood supply.

The team at Worcester, Mass.-based Advanced Cell Technology (ACT) says that if they can develop type-O negative blood—so-called “universal donor” blood because the body’s immune system will not reject it—there could be an inexhaustible supply. They were able to grow type A, type B, and type O blood, but did not make type O negative. ACT’s chief scientific officer Robert Lanza told Wired News that he doesn’t think it will be a problem to make oodles of O negative.

Susan Shurin, deputy director of the National Heart, Lung and Blood Institute in Bethesda, Md., called the new work an “important first step.”

“They’ve grown these in the test tubes and been able to get them to mature so that many aspects of these cells look like [red blood cells] that you make and I make,” she told ScientificAmerican.com. “But they haven’t given them to people and see if they survive.”

There are sugars on the surface of the cells, Shurin explains, that allow them to be recognized by the immune system. If the immune system sees them as risky, it could kill the cells. “There’s still significant work to do,” she says, “but it’s a significant step in the right direction.”

The American Red Cross agreed with Shurin’s assessment, releasing a press statement that called the work “pioneering” but also pointed out that the technique “has not progressed to the stage where the cultured cells are fully equivalent” to real red blood cells.

From around the Web:

‘Stem cells can lead to blood farms’ – Times of India

Skin cells produce library of diseased stem cells – Scientific American

Stem cell technology may make blood donations thing of the past – The Daily Telegraph

Human blood grown from stem cells – The Australian

Quotes of Dale Carnegie

•August 20, 2008 • Leave a Comment

Circumstances

“It isn’t what you have, or who you are, or where you are, or what you are doing that makes you happy or unhappy. It is what you think about.”

Criticism

“Any fool can criticize, condemn, and complain – and most fools do.”

Fear

“Do the thing you fear to do and keep on doing it… that is the quickest and surest way ever yet discovered to conquer fear.”

Timeless Observations on Life by Great Ancient Thinkers

•August 10, 2008 • Leave a Comment

I read this nice article. It is something that really calls for some pondering and for assimilating the wisdom of the great thinkers of the ancient past. I shall not reproduce the whole article here, as much of it was the writer’s own commentary, but here is the core of it for those who, like me, would like a quick recapitulation that skips the non-essential text.

Marcus Aurelius Antoninus was Roman emperor from 161 until his death in 180.
A great thinker, Marcus embodied Plato’s ideal of the philosopher king to a considerable extent.
He was the last of the “Five Good Emperors”, and is also considered one of the most important Stoic philosophers.

Marcus left behind a corpus of writing which, despite its antiquity, offers us some truly timeless wisdom. He wrote the twelve books of the Meditations in Greek as a source for his own guidance and self-improvement.

Here are six lessons we can learn from his observations on life.


Lesson 1: We Are Responsible for Our Own Experience of Life

“Such as are your habitual thoughts; such also will be the character of your mind; for the soul is dyed by the color of your thoughts.”

It is the first habit of a highly effective person to cultivate an awareness that s/he is in control. As the saying goes, life is what you make it.


Lesson 2: Everything Changes

“Time is a sort of river of passing events, and strong is its current; no sooner is a thing brought to sight than it is swept by and another takes its place, and this too will be swept away.”

People who know this and tap into the natural course of change can be very successful. Clinging on to the way things were can be a source of great misery.
The past is gone and it’s never coming back; the present is already changing.

Lesson 3: Live a Real Life

“It is not death that a man should fear, but he should fear never beginning to live.”

Taking risks is no easy thing, but when we come to the end of it all, will we regret that we stayed too long in our comfort zone?

Lesson 4: Be Grateful

“When you arise in the morning, think of what a precious privilege it is to be alive – to breathe, to think, to enjoy, to love.”

We take so many things for granted, and only when we lose them do we stop to think just how important they were to us. Every day is a gift, and there are so many, many things to be happy about.

Lesson 5: Be Detached

“Receive wealth or prosperity without arrogance; and be ready to let it go.”

Lesson 6: All Is Well

“Everything is unfolding as it must, and if you observe carefully, you will find this to be so.”

Life is a mystery, which means your thinking mind cannot make sense out of it. The world looks like a big mess to me, but if we take Marcus’ advice, sit quietly, abandon our opinions, and simply observe, then perhaps we shall indeed see that ‘all is well.’

Light Deprivation Causes Depression

•August 8, 2008 • 1 Comment


Guantanamo Bay prisoners were made to wear these eye shields and ear muffs, meant to deprive them of both light and sound – allegedly as a form of torture.

The association between darkness and depression is well established. Now a March 25 study in the Proceedings of the National Academy of Sciences reveals for the first time the profound changes that light deprivation causes in the brain.

Neuroscientists at the University of Pennsylvania kept rats in the dark for six weeks. The animals not only exhibited depressive behavior but also suffered damage in brain regions known to be underactive in humans during depression. The researchers observed neurons that produce norepinephrine, dopamine and serotonin—common neurotransmitters involved in emotion, pleasure and cognition—in the process of dying. This neuronal death, which was accompanied in some areas by compromised synaptic connections, may be the mechanism underlying the darkness-related blues of seasonal affective disorder.

Principal investigator Gary Aston-Jones, now at the Medical
University of South Carolina, speculates that the dark-induced effects
stem from a disruption of the body’s clock. “When the circadian system
is not receiving normal light, that in turn might lead to changes in
brain systems that regulate mood,” he says.

Treating the rats with an antidepressant significantly ameliorated brain damage and depressive behaviors. “Our study provides a new animal system for antidepressant development. Many existing animal models depend on stress. Our model is a stress-free means of producing a depression. It might be particularly relevant to seasonal affective disorder, but we think that it is relevant to depression overall,” Aston-Jones says.

The article appeared in Scientific American.

The Illusion of Reality – An Amazing documentary on Nuclear Physics

•July 23, 2008 • 3 Comments


Professor Jim Al-Khalili explores how studying the atom forced us to rethink the nature of reality itself. The possibilities seem limitless as we stumble upon newer and newer discoveries, probing ever deeper into the very core of existence – the minutest scale at which something can exist. This documentary resonates very well with my article “People who deny existence of God” that I posted earlier. He envisages that there might be parallel universes in which different versions of us exist. You will learn that empty space isn’t empty at all, and that knowledge can give us an entirely new perception of reality.

On another note, it makes me wonder how naive and irrelevant we are in this infinite universe, yet the ego within each of us makes us think of ourselves as the ultimate, the strongest, and the all-knowing. Humility, therefore, is often seen only in those who go on to acquire humongous knowledge, which only makes them realize how irrelevant we are.

The Mind and Attention

•July 18, 2008 • 1 Comment


By Maggie Jackson, published at boston.com

IN THE FAST-PACED, distraction-plagued arena of modern life, perhaps
nothing has come under more assault than the simple faculty of
attention. We bemoan the tug of war for our focus, joke uneasily about
our attention-deficit lifestyles, and worry about the seeming epidemic
of attention disorders among children.

The ability to pay careful attention isn’t important just for students and
air traffic controllers. Researchers are finding that attention is
crucial to a host of other, sometimes surprising, life skills: the
ability to sort through conflicting evidence, to connect more deeply
with other people, and even to develop a conscience.

But for all that, attention remains one of the most poorly understood human
faculties. Neither a subject nor a skill, precisely, attention is often
seen as a fixed, even inborn faculty that cannot be taught. Children
with attention problems are medicated; harried adults struggle to “pay
attention.” In a sense, our reigning view of attention hasn’t come far
from that of William James, the father of American psychological
research, who dolefully asserted a century ago that attention could not
be highly trained by “any amount of drill or discipline.”

But now scientists are rapidly rewriting that notion. After decades of
research powered by fresh advances in neuroimaging and genetics, many
scientists are drawing a much clearer picture of attention, which they
have come to see as an organ system like circulation or digestion, with
its own anatomy, circuitry, and chemistry. Building upon this new
understanding, researchers are discovering that skills of focus can be
bolstered with practice in both children and adults, including those
with attention-deficit disorders. In just five days of computer-based
training, the brains of 6-year-olds begin to act like those of adults on a
crucial measure of attention, one study found. Another found that
boosting short-term memory seems to improve children’s ability to stay
on task.

It is not yet known how long these gains last, or what the best methods
for developing attention may turn out to be. But the demand is clear:
Dozens of schools nationwide are already incorporating some kind of
attention training into their curriculum. And as this new arena of
research helps overturn long-standing assumptions about the
malleability of this essential human faculty, it offers intriguing
possibilities for a world of overload.

“If you have good attentional control, you can do more than just pay
attention to someone speaking at a lecture, you can control your
cognitive processes, control your emotions, better articulate your
actions,” says Amir Raz, a cognitive neuroscientist at McGill
University who is a leading attention researcher. “You can enjoy and
gain an edge in life.”

Attention has long fascinated humankind as a window into the mind and the world
in general, yet its workings have historically been murky.
Eighteenth-century scientists, who considered unwavering visual
observation crucial to scientific discovery, theorized that attention
was a “pooling” of nervous fluid. Later, Victorian scientists eagerly
probed the limits and vulnerability of attention, treating the subject
of their inquiry with a mix of puzzlement and admiration. “Whatever its
nature, [attention] is plainly the essential condition of the formation
and development of mind,” wrote Henry Maudsley in the late 1860s.

More recently, scientists have used advances in genetics and imaging
technologies that can map brain activity to formulate more detailed
theories of what, exactly, attention is. It has been compared to a
filter, a mental spotlight, and a tool for allocating our cognitive
resources. Increasingly, however, attention is viewed as a complex
system comprising three networks, or types of attention: focus,
awareness, and “executive” attention, which governs planning and
higher-order decision-making. According to this model, first proposed
by University of Oregon neuroscientist Michael I. Posner, the three
attentional networks are independent, yet work closely together.

Armed with an improved sense of how attention works, Posner and others have
begun researching whether attention can be trained. And their findings
have been intriguing.

After years of research into how attention networks develop, Posner and
colleague Mary K. Rothbart began experimenting a few years ago with
training children’s attention. They targeted children 6 and under,
since executive attention develops rapidly between ages 4 and 7.
Inspired by computer-learning work with monkeys, Posner and Rothbart
created a five-day computer-based program to strengthen executive
attention skills such as working memory, self-control, planning, and
observation. Building on a known link between this attention network
and internal conflict resolution, one exercise challenges a child to
pick the larger of two groups of objects, such as apples or numerals.
In the latter case, the symbolic and the literal counts conflict,
forcing concentrated thought.
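
To make the conflict idea concrete, here is a minimal sketch, in Python, of what one such trial might look like. This is purely illustrative and is not Posner and Rothbart’s actual program; the trial format and the make_trial helper are assumptions for the sake of the example. The trick is that the group containing more items is printed using a smaller numeral, so the symbolic value and the literal count pull in opposite directions.

    import random

    def make_trial():
        # Build one conflict trial: two groups of numerals in which the
        # printed digit disagrees with how many digits are shown.
        # (Illustrative sketch only, not the researchers' software.)
        count_a, count_b = random.sample(range(1, 6), 2)
        # Use each group's count as the OTHER group's digit, so the group
        # with MORE items is drawn with the SMALLER numeral: the conflict.
        left = " ".join(str(count_b) for _ in range(count_a))
        right = " ".join(str(count_a) for _ in range(count_b))
        correct = "left" if count_a > count_b else "right"
        return left, right, correct

    if __name__ == "__main__":
        left, right, correct = make_trial()
        print("Which group has MORE items?")
        print("  left:  " + left)
        print("  right: " + right)
        answer = input("left or right? ").strip().lower()
        print("Correct!" if answer == correct else "No, the answer was " + correct)

Resolving such a trial requires suppressing the automatic reading of the numeral in favor of actually counting, which is exactly the kind of internal conflict the executive attention network is thought to handle.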

After the training, Posner and Rothbart reported that 6-year-olds showed a
pattern of activity in the anterior cingulate – a banana-shaped brain
region that is ground zero for executive attention – similar to that of
adults, along with slightly higher scores on IQ tests and a marked gain
in executive attention. The children who were the most inattentive
gained the most from the program. The results were published in the
Proceedings of the National Academy of Sciences, and have since been
replicated in similar experiments by Spanish researchers.

“We thought this was a long shot,” says Posner, a lanky septuagenarian with
a deep, rumbling voice. “Now I’ve changed my mind.” Though small-scale,
the results from his lab and others have been so remarkable that he and
Rothbart are now calling on educators at conferences and in their book,
“Educating the Human Brain,” to consider teaching attention in
preschool.

“We should think of this work not just as remediation, but as a normal part
of education,” Posner said in an address to the American Psychological
Association in 2003, when he presented preliminary findings.

A parallel line of investigation is based on the close link between
attention and memory. “Working memory” is the short-term cognitive
storehouse that helps us recall a phone number or the image of a
landscape; this type of memory is integral to executive attention.
Tapping into this link, cognitive neuroscientist Torkel Klingberg of
Sweden’s Karolinska Institute devised computer software to improve
executive attention by training working memory in teens and
pre-adolescents with attention-deficit/hyperactivity disorder.

Using a training program he calls “RoboMemo,” Klingberg has helped children
improve their working memory and complex reasoning skills, according to
studies published in the Journal of the American Academy of Child and
Adolescent Psychiatry, among other publications. This appears to pay
off in attention as well: The children were also reported to be less
impulsive and inattentive by their parents, although their teachers
largely did not report those behavioral improvements.

Christopher Lucas of New York University, one of the US researchers using
Klingberg’s software, used the RoboMemo training program to boost the
visuospatial memory of a group of children, and found that as this type
of working memory improved, they became more focused and compliant.
Lucas, a psychiatrist, cautioned that such memory training isn’t a
quick fix for attention-deficit disorders. Working memory “is one of
the areas that’s implicated in ADHD,” he says. “I don’t think it’s the
whole story.”

Other attention research eschews that kind of technology, instead
investigating the attention-boosting potential of something very
different: the 2,500-year-old tradition of meditative practice. With a
long history but little scientific data on its effects, meditation has
begun to intrigue neuroscientists in labs around the country, who are
measuring the success of meditative practices that boost skills of
focus and awareness.

Lidia Zylowska, an assistant clinical professor in psychiatry at UCLA,
cofounded the university’s Mindful Awareness Research Center and is a
pioneer in the study of meditation’s impact on human focus and
attention.

In one study, Zylowska and colleagues reported that eight weeks of
mindfulness meditation – a technique designed to improve attention and
well-being largely by focusing on breathing – boosted both powers of
focus and self-control in 24 adults and eight teens with ADHD. The work
was published in May in the Journal of Attention Disorders. Others are
finding similar gains from meditation in those without ADHD.
Preliminary results from the largest attention-training study to date,
which tracked 64 people meditating full-time for three months, reveal
improved sustained attention and visual discrimination, says the lead
researcher, UC Davis neuroscientist Clifford Saron, who presented the
results at the Cognitive Neuroscience Society’s annual meeting in April.

If focus skills can be groomed, as research has begun to hint, the
important next question is whether, and how, attention should be
integrated into education. Will attention become a 21st-century
“discipline,” a skill taught by parents, educators, even employers?
Already a growing number of educators are showing interest in attention
training, mostly through the practice of meditation in the classroom.

Susan Kaiser Greenland, a former corporate lawyer who started the nonprofit
InnerKids Foundation in 2001 to teach meditation practices in
communities and schools, says demand outstrips her staffing. The Santa
Monica, Calif.-based nonprofit works with children ages 4 to 12.

“The kids are stressed out, they are distracted, and they are not able to
sit still,” she says. “There are more schools interested in our work
than we can possibly serve.”

But with the field of attention training still in its infancy, scientists
don’t yet understand if any current teaching has long-lasting gains –
or, for that matter, which practices work best. Some researchers, for
example, question computer-based efforts as too narrow in scope,
arguing that children must be taught attention holistically, as a life
skill. No brief training regime is likely to be a magic bullet, they
say.

“Part of the problem in today’s society is that people are looking for
extremely quick fixes that have no vision. People are looking to lose
20 pounds for the wedding next week,” says Raz at McGill. “But
attention training is a slow process.”

Nonetheless, with global use of controversial ADHD medicines tripling since the
early 1990s and evidence mounting that attention can be strengthened,
researchers are permitting themselves a bit of cautious excitement at
the prospect that attention training could work, especially for
children.

“Attention is such a basic skill that children need, and to be able to impact that
skill, to teach them how to redirect their attention and how to become
more aware of themselves, their bodies, emotions, and thoughts – it’s
an exciting thing,” says Zylowska. “It’s also critical.”

The stroke that caused Bliss and Euphoria!

•June 12, 2008 • Leave a Comment

Came across this article in the New York Times while spending some of my favourite time reading about brain science, which thrills me so much.
I have been meaning to share all that I read in some brief articles here, but a packed schedule and other commitments always fail me. I promise to come out with some good personally composed articles to enlighten many on advances in brain science.

For the time being, let me share this awesome article.
It throws some light on how our chit-chattery, worked-up, egoistic left brain hemisphere can be overpowered.
I wish I could gain some mastery of this; it is not so easy, but we can all try spending some secluded time in the right hemisphere of our brain. This lady, a neuroscientist, experienced the bliss of her RIGHT brain, but only after her stroke had paralysed her LEFT. It makes an interesting read.


(courtesy Leslie Kaufman, The New York Times)

JILL BOLTE TAYLOR was a neuroscientist working at Harvard’s brain research center when she experienced nirvana.

But she did it by having a stroke.

On Dec. 10, 1996, Dr. Taylor, then 37, woke up in her apartment near Boston with a piercing pain behind her eye. A blood vessel in her brain had popped. Within minutes, her left lobe — the source of ego, analysis, judgment and context — began to fail her. Oddly, it felt great.

The incessant chatter that normally filled her mind disappeared. Her everyday worries — about a brother with schizophrenia and her high-powered job — untethered themselves from her and slid away.

Her perceptions changed, too. She could see that the atoms and molecules making up her body blended with the space around her; the whole world and the creatures in it were all part of the same magnificent field of shimmering energy.

“My perception of physical boundaries was no longer limited to where my skin met air,” she has written in her memoir, “My Stroke of Insight,” which was just published by Viking.

After experiencing intense pain, she said, her body disconnected from her mind. “I felt like a genie liberated from its bottle,” she wrote in her book. “The energy of my spirit seemed to flow like a great whale gliding through a sea of silent euphoria.”

While her spirit soared, her body struggled to live. She had a clot the size of a golf ball in her head, and without the use of her left hemisphere she lost basic analytical functions like her ability to speak, to understand numbers or letters, and even, at first, to recognize her mother. A friend took her to the hospital. Surgery and eight years of recovery followed.

Her desire to teach others about nirvana, Dr. Taylor said, strongly motivated her to squeeze her spirit back into her body and to get well.

This story is not typical of stroke victims. Left-brain injuries don’t necessarily lead to blissful enlightenment; people sometimes sink into a helplessly moody state: their emotions run riot. Dr. Taylor was also helped because her left hemisphere was not destroyed, and that probably explains how she was able to recover fully.

Today, she says, she is a new person, one who “can step into the consciousness of my right hemisphere” on command and be “one with all that is.”

She brings a deep personal understanding to something she long studied: that the two lobes of the brain have very different personalities. Generally, the left brain gives us context, ego, time, logic. The right brain gives us creativity and empathy. For most English-speakers, the left brain, which processes language, is dominant. Dr. Taylor’s insight is that it doesn’t have to be so.

Her message, that people can choose to live a more peaceful, spiritual life by sidestepping their left brain, has resonated widely.

In February, Dr. Taylor spoke at the Technology, Entertainment, Design conference (known as TED), the annual forum for presenting innovative scientific ideas. The result was electric. After her 18-minute address was posted as a video on TED’s Web site, she became a mini-celebrity. More than two million viewers have watched her talk, and about 20,000 more a day continue to do so. An interview with her was also posted on Oprah Winfrey’s Web site, and she was chosen as one of Time magazine’s 100 most influential people in the world for 2008.

Originally, Dr. Taylor became a brain scientist — she has a Ph.D. in life sciences with a specialty in neuroanatomy — because she has a mentally ill brother who suffers from delusions that he is in direct contact with Jesus. And for her old research lab at Harvard, she continues to speak on behalf of the mentally ill.

But otherwise, she has dialed back her once loaded work schedule. Her house is on a leafy cul-de-sac minutes from Indiana University, which she attended as an undergraduate and where she now teaches at the medical school.

Her foyer is painted a vibrant purple. She greets a stranger at the door with a warm hug. When she talks, her pale blue eyes make extended contact.

Never married, she lives with her dog and two cats. She unselfconsciously calls her mother, 82, her best friend.

Dr. Taylor says, “nirvana exists right now.”

“There is no doubt that it is a beautiful state and that we can get there,” she said.

Dr. Francine Benes, director of the Harvard brain bank where Dr. Taylor once worked, makes clear that she still thinks Dr. Taylor is an extraordinary and competent woman. “It is just that the mystical side was not apparent when she was at Harvard,” Dr. Benes said.

Dr. Taylor makes no excuses or apologies, or even explanations. She says instead that she continues to battle her left brain for the better. She gently offers tips on how it might be done.

“As the child of divorced parents and a mentally ill brother, I was angry,” she said. Now when she feels anger rising, she trumps it with a thought of a person or activity that brings her pleasure. No meditation necessary, she says, just the belief that the left brain can be tamed.

Her newfound connection to other living beings means that she is no longer interested in performing experiments on live rat brains, which she did as a researcher.

She is committed to making time for passions — physical and visual — that she believes exercise her right brain, including water-skiing, guitar playing and stained-glass making. A picture of one of her intricate stained-glass pieces — of a brain — graces the cover of her book.

Karen Armstrong, a religious historian who has written several popular books including one on the Buddha, says there are odd parallels between his story and Dr. Taylor’s.

“Like this lady, he was reluctant to return to this world,” she said. “He wanted to luxuriate in the sense of enlightenment.”

But, she said, “the dynamic of the religious required that he go out into the world and share his sense of compassion.”

And in the end, compassion is why Dr. Taylor says she wrote her memoir. She thinks there is much to be mined from her experience on how brain-trauma patients might best recover and, in fact, she hopes to open a center in Indiana to treat such patients based on those principles.

And then there is the question of world peace. No, Dr. Taylor doesn’t know how to attain that, but she does think the right hemisphere could help. Or as she told the TED conference:

“I believe that the more time we spend choosing to run the deep inner peace circuitry of our right hemispheres, the more peace we will project into the world, and the more peaceful our planet will be.”

It almost seems like science.