News 2020


November 2020:

Chemicals in Your Living Room Cause Diabetes

“A new UC Riverside study shows flame retardants found in nearly every American home cause mice to give birth to offspring that become diabetic.

These flame retardants, called PBDEs, have been associated with diabetes in adult humans. This study demonstrates that PBDEs cause diabetes in mice exposed to the chemical only through their mothers.

“The mice received PBDEs from their mothers while they were in the womb and as young babies through mother’s milk,” said Elena Kozlova, lead study author and UC Riverside neuroscience doctoral student. “Remarkably, in adulthood, long after the exposure to the chemicals, the female offspring developed diabetes.”

Results of the study have been published in the journal Scientific Reports.

PBDEs are common household chemicals added to furniture, upholstery, and electronics to prevent fires. They get released into the air people breathe at home, in their cars, and in airplanes because their chemical bond to surfaces is weak.

“Even though the most harmful PBDEs have been banned from production and import into the U.S., inadequate recycling of products that contain them has continued to leach PBDEs into water, soil, and air. As a result, researchers continue to find them in human blood, fat, fetal tissues, as well as maternal breast milk in countries worldwide.”

Given their previous association with diabetes in adult men and women, and in pregnant women, Curras-Collazo and her team wanted to understand whether these chemicals could have harmful effects on children of PBDE-exposed mothers. But such experiments can only be done in mice.

Diabetes leads to elevated levels of blood glucose, or blood sugar. After a meal, the pancreas releases insulin, a hormone that helps cells utilize glucose from food. When cells are resistant to insulin, it doesn’t work as intended, and glucose levels remain high in the blood even when no food has been eaten.

Chronically high levels of glucose can cause damage to the eyes, kidneys, heart, and nerves. It can also lead to life-threatening conditions.

“This study is unique because we tested both the mothers and their offspring for all the hallmarks of diabetes exhibited in humans,” Curras-Collazo said. “This kind of testing has not been done before, especially on female offspring.”

The researchers gave PBDEs to the mouse mothers at low levels comparable to average human environmental exposure both during pregnancy and lactation.

All of the babies developed glucose intolerance, high fasting glucose levels, insulin insensitivity, and low blood insulin levels, which are all hallmarks of diabetes. In addition, researchers also found the babies had high levels of endocannabinoids in the liver, which are molecules associated with appetite, metabolism, and obesity.

“We need to know if human babies exposed to PBDEs both before and after birth go on to become diabetic children and adults,” Kozlova said.

In the meantime, Curras-Collazo advises people to limit PBDE exposure by taking steps such as washing hands before eating, vacuuming frequently, and buying furniture and other products that do not contain them. She also hopes expectant mothers are well informed about stealth environmental chemicals that can affect their unborn and developing children, as well as their breast milk.

“We believe the benefits babies get from mothers’ milk far outweigh the risks of passing on the PBDEs to children. We do not recommend curtailing breastfeeding,” she said. “But let’s advocate for protecting breast milk and our bodies from killer couch chemicals.”

Neuroscience Journal

Malice Leaves a Nasty Smell

“Immoral behaviours trigger moral judgments that are underpinned by basic emotions that contribute to our ability to survive. Two different hypotheses are to be found in the current scientific literature as to the identity of these emotions: some researchers single out disgust, while others opt for pain. After developing a new approach to brain imaging, a research team from the University of Geneva (UNIGE) has come down on the side of disgust.

The study, published in Science Advances, shows that immoral behaviours trigger brain responses similar to those prompted by bad smells. The research also identifies, for the first time, a biomarker in the brain for disgust.

Disgust is a basic emotion linked to our survival. Smell provides information about the freshness of foodstuffs, and disgust lets us take action to avoid a potential source of poisoning. Following the same principle, pain helps us cope with injuries by activating our withdrawal reflexes. Psychologists believe that these types of survival reflexes might come into play in response to other people’s bad behaviour.

Disgust or pain

“These connections have been demonstrated via associations between situations and sensations,” begins Professor Corrado Corradi-Dell’Acqua, a researcher in UNIGE’s Department of Psychology and the study’s lead investigator. “For instance, if I drink something while reading an article about corruption that affects my moral judgment, I may find that my drink smells bad and tastes vile. Equally, the reverse is true: smells can generate inappropriate moral judgment. In concrete terms, if someone smells bad, other people tend to make the judgment that they’re unhealthy.”

While some studies suggest that disgust is involved in the process, others opt for pain, since they consider that moral judgments are made based on actual facts – hence the parallel with the mechanisms involved in pain. “If a driver is distracted, and does not see a pedestrian crossing a road, I will judge this person more negatively if the pedestrian was actually harmed, rather than avoided by chance”, explains the psychologist. His team set up an experimental paradigm and customised magnetic resonance imaging (MRI) techniques in an attempt to decide between the contradictory hypotheses.

The train dilemma as a paradigm

The first step was for Corradi-Dell’Acqua’s laboratory to subject volunteers to unpleasant odours or heat-induced pain. “The whole idea was to elicit a similar degree of discomfort with the two techniques so that they could work on the same levels.” Once the calibration had been performed, participants in the study were subjected to readings that evoked value judgments.

“We used the train dilemma, in which five people are stuck on a railway track as a train approaches. The only possible way to save them is to push someone off the top of a bridge so that the switch is hit as they fall. In other words, it’s necessary to kill one person to save five in a highly immoral situation,” explains the researcher.

Reading this unpleasant dilemma influenced how participants perceived the odours, making them more disgusting, but it did not influence pain, an outcome backed up by the participants’ electrodermal activity. This is a physiological measurement of the electrical conductance of the skin: it reflects the rate of sweating and the activity of the nervous system responsible for involuntary behaviour.

Professor Corradi-Dell’Acqua then concentrated on the brain response. “It is difficult to infer pain and disgust from neural activity, as these two experiences often recruit the same brain areas. To dissociate them, we had to measure the global neuronal activity via MRI rather than focusing on specific regions,” summarises the researcher. The Geneva team adopted a technique that predicts disgust and pain from overall brain activity, yielding specific biomarkers for each.
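The decoding logic described here, reading a state from distributed activity across the whole brain rather than from one region, can be sketched with a generic multivariate-pattern analysis. The sketch below is not the Geneva team's pipeline; the data are synthetic and the dimensions and classifier are arbitrary choices:

```python
# Generic multivariate-pattern-analysis sketch (illustrative only): a linear
# classifier trained on whole-brain "voxel" patterns to tell disgust trials
# from pain trials, even though no single voxel separates them strongly.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 500

labels = rng.integers(0, 2, n_trials)      # 0 = pain trial, 1 = disgust trial
signature = rng.normal(0, 0.2, n_voxels)   # weak, distributed condition effect
patterns = rng.normal(0, 1, (n_trials, n_voxels)) + np.outer(labels, signature)

# Train on the first 150 trials, test on the held-out 50: the distributed
# signature acts as a "biomarker" readable only from the overall pattern.
clf = LogisticRegression(max_iter=1000).fit(patterns[:150], labels[:150])
accuracy = clf.score(patterns[150:], labels[150:])
```

Because the condition effect is spread thinly over many voxels, a voxel-by-voxel test would miss it, whereas the whole-pattern classifier recovers it; this mirrors the motivation the researchers give for measuring global rather than regional activity.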

Using this tool, the researchers were able to prove that the overall brain response to disgust was influenced by previous moral judgment. Once again, moral judgments are indeed associated with disgust. “In addition to this important discovery for psychology, this study was the occasion for the development of a biomarker prototype for olfactory disgust. It’s a double step forward!” concludes Corradi-Dell’Acqua.”

Neuroscience Journal

Why Motivation to Learn Declines With Age

“As people age, they often lose their motivation to learn new things or engage in everyday activities. In a study of mice, MIT neuroscientists have now identified a brain circuit that is critical for maintaining this kind of motivation.

This circuit is particularly important for learning to make decisions that require evaluating the cost and reward that come with a particular action. The researchers showed that they could boost older mice’s motivation to engage in this type of learning by reactivating this circuit, and they could also decrease motivation by suppressing the circuit.

“As we age, it’s harder to have a get-up-and-go attitude toward things,” says Ann Graybiel, an Institute Professor at MIT and member of the McGovern Institute for Brain Research. “This get-up-and-go, or engagement, is important for our social well-being and for learning — it’s tough to learn if you aren’t attending and engaged.”

Graybiel is the senior author of the study, which appears today in Cell. The paper’s lead authors are Alexander Friedman, a former MIT research scientist who is now an assistant professor at the University of Texas at El Paso, and Emily Hueske, an MIT research scientist.

Evaluating cost and benefit

The striatum is part of the basal ganglia — a collection of brain centers linked to habit formation, control of voluntary movement, emotion, and addiction. For several decades, Graybiel’s lab has been studying clusters of cells called striosomes, which are distributed throughout the striatum. Graybiel discovered striosomes many years ago, but their function had remained mysterious, in part because they are so small and deep within the brain that it is difficult to image them with functional magnetic resonance imaging (fMRI).

In recent years, Friedman, Graybiel, and colleagues have discovered that striosomes play an important role in a type of decision-making known as approach-avoidance conflict. These decisions involve choosing whether to take the good with the bad — or to avoid both — when given options that have both positive and negative elements. An example of this kind of decision is having to choose whether to take a job that pays more but forces a move away from family and friends. Such decisions often provoke great anxiety.

In a related study, Graybiel’s lab found that striosomes connect to cells of the substantia nigra, one of the brain’s major dopamine-producing centers. These studies led the researchers to hypothesize that striosomes may be acting as a gatekeeper that absorbs sensory and emotional information coming from the cortex and integrates it to produce a decision on how to act. These actions can then be invigorated by the dopamine-producing cells.

The researchers later discovered that chronic stress has a major impact on this circuit and on this kind of emotional decision-making. In a 2017 study performed in rats and mice, they showed that stressed animals were far more likely to choose high-risk, high-payoff options, but that they could block this effect by manipulating the circuit.

In the new Cell study, the researchers set out to investigate what happens in striosomes as mice learn how to make these kinds of decisions. To do that, they measured and analyzed the activity of striosomes as mice learned to choose between positive and negative outcomes.

During the experiments, the mice heard two different tones, one of which was accompanied by a reward (sugar water), and another that was paired with a mildly aversive stimulus (bright light). The mice gradually learned that if they licked a spout more when they heard the first tone, they would get more of the sugar water, and if they licked less during the second, the light would not be as bright.

Learning to perform this kind of task requires assigning value to each cost and each reward. The researchers found that as the mice learned the task, striosomes showed higher activity than other parts of the striatum, and that this activity correlated with the mice’s behavioral responses to both of the tones. This suggests that striosomes could be critical for assigning subjective value to a particular outcome.

“In order to survive, in order to do whatever you are doing, you constantly need to be able to learn. You need to learn what is good for you, and what is bad for you,” Friedman says.

“A person, or in this case a mouse, may value a reward so highly that the risk of experiencing a possible cost is overwhelmed, while another may wish to avoid the cost to the exclusion of all rewards. And these may result in reward-driven learning in some and cost-driven learning in others,” Hueske says.

The researchers found that inhibitory neurons that relay signals from the prefrontal cortex help striosomes to enhance their signal-to-noise ratio, which helps to generate the strong signals that are seen when the mice evaluate a high-cost or high-reward option.

Loss of motivation

Next, the researchers found that in older mice (between 13 and 21 months, roughly equivalent to people in their 60s and older), the mice’s engagement in learning this type of cost-benefit analysis went down. At the same time, their striosomal activity declined compared to that of younger mice. The researchers found a similar loss of motivation in a mouse model of Huntington’s disease, a neurodegenerative disorder that affects the striatum and its striosomes.

When the researchers used genetically targeted drugs to boost activity in the striosomes, they found that the mice became more engaged in performance of the task. Conversely, suppressing striosomal activity led to disengagement.

In addition to normal age-related decline, many mental health disorders can skew the ability to evaluate the costs and rewards of an action, from anxiety and depression to conditions such as PTSD. For example, a depressed person may undervalue potentially rewarding experiences, while someone suffering from addiction may overvalue drugs but undervalue things like their job or their family.

The researchers are now working on possible drug treatments that could stimulate this circuit, and they suggest that training patients to enhance activity in this circuit through biofeedback could offer another potential way to improve their cost-benefit evaluations.

“If you could pinpoint a mechanism which is underlying the subjective evaluation of reward and cost, and use a modern technique that could manipulate it, either psychiatrically or with biofeedback, patients may be able to activate their circuits correctly,” Friedman says.”

Neuroscience Journal

Why Low Oxygen Damages the Brain

“Brain cell dysfunction in low oxygen is, surprisingly, caused by the very same responder system that is intended to be protective, according to a new published study by a team of researchers at the Case Western Reserve University School of Medicine.

“These powerful protein responders initially protect brain cells from low oxygen as expected, but we find that their prolonged activity leads to unintended collateral damage that ultimately impairs brain cell function,” said the study’s principal investigator Paul Tesar, a professor in the Department of Genetics and Genome Sciences at the Case Western Reserve School of Medicine and the Dr. Donald and Ruth Weber Goodman Professor of Innovative Therapeutics.

Defining the mechanism of brain-cell damage in low oxygen conditions provides an opportunity to develop effective therapies, including a class of drugs studied in their research that could inform future clinical approaches for many neurological diseases caused by low oxygen. The work also clarifies how the response to low oxygen causes disease in other tissues outside the brain.

Their research was published online Oct. 21 in the journal Cell Stem Cell.

The body’s response to low oxygen

With the dawn of an oxygenated atmosphere, a burst of multicellular life became possible, as oxygen could be used to produce the energy needed to support complex life functions. Given the requirement of oxygen for life, nearly all organisms evolved a mechanism to rapidly respond to low oxygen – a condition called hypoxia. The Nobel Prize in Physiology or Medicine was awarded in 2019 for discoveries of how cells in our body sense low oxygen levels and respond to stay alive.

At the core of this ancient response are proteins called hypoxia-inducible factors (HIFs), which instruct the cell to minimize oxygen consumption and maximize their access to oxygen. In this way, HIFs can be thought of as valiant heroes attempting to protect and resuscitate cells in the immediate response to low oxygen.

Prolonged hypoxia causes dysfunction in many tissues. In particular, stem cells in the brain are impaired by hypoxia in many diseases, including stroke, cerebral palsy related to premature birth, respiratory distress syndromes, multiple sclerosis and vascular dementia. Even the significant neurological damage caused by COVID-19 is attributed to hypoxia.

Until now, the precise causes of cell malfunction due to low oxygen were unknown.

The dark side of the hypoxia response

In this study, researchers developed a new approach to closely study how the hypoxia responder proteins function. By comparing how they work in brain-stem cells with other tissues, such as heart and skin, the scientists confirmed that the hypoxia responder proteins perform a beneficial function to promote cell survival in low oxygen in all tissues. However, these same hypoxia responder proteins had a previously unappreciated dark side, as they also switched on other cellular processes outside of the core beneficial response.

The team then demonstrated that this additional–and previously unknown–response is what impaired brain-stem cell function. This suggests that, while hypoxia responder proteins evolved to promote cell survival in all tissues of the body in low-oxygen conditions, their powerful effects can also have unintended consequences to disrupt cell function.

New opportunities for treating hypoxia damage

The authors tested thousands of drugs to try to restore brain-stem cell function to overcome the damaging effects of the hypoxia responder proteins. They discovered a group of drugs that specifically overcome the damage-inducing response, while leaving the beneficial response intact.

“One of the exciting avenues that stems from this work is identifying drugs that specifically target the damaging side of the hypoxia response while sparing the beneficial side,” said first author Kevin Allan, a graduate student in Case Western’s Medical Scientist Training Program. “This offers a new perspective on combating tissue damage due to hypoxia.”

“Whether the damaging side of the hypoxia response is solely an unintended pathological effect or potentially a previously undiscovered normal process that goes awry in disease remains unknown,” Tesar said. “Our work opens the door to a new way of thinking about how cells respond to low oxygen in health and disease.”

Neuroscience Journal

October 2020:

Machine Learning Predicts How Long Museum Visitors Will Engage With Exhibits

“In a proof-of-concept study, education and artificial intelligence researchers have demonstrated the use of a machine-learning model to predict how long individual museum visitors will engage with a given exhibit. The finding opens the door to a host of new work on improving user engagement with informal learning tools.

“Education is an important part of the mission statement for most museums,” says Jonathan Rowe, co-author of the study and a research scientist in North Carolina State University’s Center for Educational Informatics (CEI). “The amount of time people spend engaging with an exhibit is used as a proxy for engagement and helps us assess the quality of learning experiences in a museum setting. It’s not like school – you can’t make visitors take a test.”

“If we can determine how long people will spend at an exhibit, or when an exhibit begins to lose their attention, we can use that information to develop and implement adaptive exhibits that respond to user behavior in order to keep visitors engaged,” says Andrew Emerson, first author of the study and a Ph.D. student at NC State.

“We could also feed relevant data to museum staff on what is working and what people aren’t responding to,” Rowe says. “That can help them allocate personnel or other resources to shape the museum experience based on which visitors are on the floor at any given time.”

To determine how machine-learning programs might be able to predict user interaction times, the researchers closely monitored 85 museum visitors as they engaged with an interactive exhibit on environmental science. Specifically, the researchers collected data on study participants’ facial expressions, posture, where they looked on the exhibit’s screen and which parts of the screen they touched.

The data were fed into five different machine-learning models to determine which combinations of data and models resulted in the most accurate predictions.

“We found that a particular machine-learning method called ‘random forests’ worked quite well, even using only posture and facial expression data,” Emerson says.

The researchers also found that the models worked better the longer people interacted with the exhibit, since that gave them more data to work with. For example, a prediction made after a few minutes would be more accurate than a prediction made after 30 seconds. For context, user interactions with the exhibit lasted as long as 12 minutes.
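As a rough sketch of this kind of pipeline (not the study's actual code; the feature names, data, and sizes below are invented), a random-forest regressor can map behavioural features to a predicted dwell time:

```python
# Hypothetical sketch: predict how long a visitor stays at an exhibit from
# posture and facial-expression features using a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_visitors = 85                      # matches the study's sample size

# Invented features: lean-in angle, smile intensity, gaze-on-screen ratio.
features = rng.random((n_visitors, 3))
# Invented ground truth (seconds, up to ~12 minutes): engaged postures and
# sustained gaze are assumed here to predict longer stays.
dwell_time = 720 * (0.5 * features[:, 0] + 0.5 * features[:, 2]) \
    + rng.normal(0, 30, n_visitors)

X_train, X_test, y_train, y_test = train_test_split(
    features, dwell_time, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
predicted = model.predict(X_test)    # predicted dwell times for new visitors
```

In a live setting the same model could be re-queried as behavioural data accumulate during a visit, which is consistent with the finding that predictions made later in an interaction are more accurate.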

“We’re excited about this, because it paves the way for new approaches to study how visitors learn in museums,” says Rowe. “Ultimately, we want to use technology to make learning more effective and more engaging.”

Neuroscience Journal

High-fiber diet, low-level inflammation: Sidestepping the effects of radiation

“Loved or hated, the humble oat could be the new superfood for cancer patients as international research shows a diet rich in fibre could significantly reduce radiation-induced gut inflammation.

Conducted by the University of Gothenburg, Lund University and the University of South Australia, the preclinical study found that dietary oat bran can offset chronic gastrointestinal damage caused by radiotherapy, contradicting long-held clinical recommendations.

UniSA gastroenterology and oncology researcher Dr Andrea Stringer says the research provides critical new insights for radiotherapy patients.

“Cancer patients are often advised to follow a restricted fibre diet. This is because a diet high in fibre is believed to exacerbate bloating and diarrhea — both common side effects of radiotherapy,” Dr Stringer says.

“Yet, this advice is not unequivocally evidence-based, with insufficient fibre potentially being counterproductive and exacerbating gastrointestinal toxicity.

“Our study compared the effects of high-fibre and no-fibre diets, finding that a fibre-free diet is actually worse for subjects undergoing radiotherapy treatment.

“A diet without fibre generates inflammatory cytokines which are present for a long time following radiation, resulting in increased inflammation of the digestive system.

“Conversely, a fibre-rich diet decreases the presence of cytokines to reduce radiation-induced inflammation, both in the short and the long term.”

Intestinal issues following radiotherapy are problematic for many cancer survivors.

“In Europe, approximately one million pelvic-organ cancer survivors suffer from compromised intestinal health due to radiation-induced gastrointestinal symptoms,” Dr Stringer says.

“This is also commonplace in Australia and around the world with no immediate cure or effective treatment.

“If we can prevent some of the inflammation resulting from radiation simply by adjusting dietary fibre levels, we could improve long-term, and possibly life-long, intestinal health among cancer survivors.”

Science Daily

The brain rhythms that detach us from reality

The rhythmic activity of a single layer of neurons has now been shown to cause dissociation — an experience involving a feeling of disconnection from the surrounding world.

“The state of dissociation is commonly described as feeling detached from reality or having an ‘out of body’ experience. This altered state of consciousness is often reported by people who have psychiatric disorders arising from devastating trauma or abuse. It is also evoked by a class of anaesthetic drug, and can occur in epilepsy. The neurological basis of dissociation has been a mystery, but writing in Nature, Vesuna et al. [1] describe a localized brain rhythm that underlies this state. Their findings will have far-reaching implications for neuroscience.

Of the drugs tested, only the dissociative drugs, such as the anaesthetic ketamine, produced robust oscillations in neuronal activity in a brain region called the retrosplenial cortex. This region is essential for various cognitive functions, including episodic memory and navigation [2]. The oscillations occurred at a low frequency, of about 1–3 hertz. By contrast, non-dissociative drugs such as the anaesthetic propofol and the hallucinogen lysergic acid diethylamide (LSD) did not trigger this rhythmic retrosplenial activity.

Vesuna et al. examined the active cells in more detail using a high-resolution approach called two-photon imaging. This analysis revealed that the oscillations were restricted to cells in layer 5 of the retrosplenial cortex. The authors then recorded neuronal activity across multiple brain regions. Normally, other parts of the cortex and subcortex are functionally connected to neuronal activity in the retrosplenial cortex; however, ketamine caused a disconnect, such that many of these brain regions no longer communicated with the retrosplenial cortex.

The researchers next asked whether inducing the retrosplenial rhythm could cause dissociation. They made use of mice in which layer-5 cells were modified to simultaneously express two ion-channel proteins that are sensitive to light.

The authors then deleted two genes that encode ion-channel proteins in the retrosplenial cortex. The first gene encodes a channel activated by the neurotransmitter molecule glutamate. The second encodes hyperpolarization-activated cyclic nucleotide-gated 1 (HCN1), a channel activated by cations that is sometimes called a pacemaker, because of its ability to produce rhythmic activity in the heart and neurons. Vesuna et al. found that the ketamine-induced rhythm was reduced in mice lacking either gene. However, only the HCN1 channel was needed for ketamine to elicit dissociation-like behaviours.

The complex state of dissociation can be fully described only by humans, who can report their experience. For example, a study in humans was needed to prove that the dissociative and analgesic properties of ketamine are independent [9]. Going forward, studies that use dissociative drugs in people will continue to be of great interest — for instance, to reveal the connection (if any) between the brain rhythm reported by Vesuna et al. and the various desirable properties of ketamine. Such studies should also include medicines, such as benzodiazepines and lamotrigine, that attenuate ketamine-induced dissociation. An improved understanding of how ketamine alters brain rhythms and associated behavioural states could eventually lead to therapeutics for people experiencing chronic pain, depression and perhaps dissociative disorders.”

Nature Journal

A Computer Predicts Your Thoughts, Creating Images Based on Them

Researchers at the University of Helsinki have developed a technique in which a computer models visual perception by monitoring human brain signals. In a way, it is as if the computer tries to imagine what a human is thinking about. As a result of this imagining, the computer is able to produce entirely new information, such as fictional images that were never before seen.

The technique is based on a novel brain-computer interface. Previous brain-computer interfaces of this kind have been able to perform only one-way communication from brain to computer, such as spelling individual letters or moving a cursor.

As far as is known, the new study is the first where both the computer’s presentation of the information and brain signals were modelled simultaneously using artificial intelligence methods. Images that matched the visual characteristics that participants were focusing on were generated through interaction between human brain responses and a generative neural network.

Neuroadaptive generative modelling

The researchers call this method neuroadaptive generative modelling. A total of 31 volunteers participated in a study that evaluated the effectiveness of the technique. Participants were shown hundreds of AI-generated images of diverse-looking people while their EEG was recorded.

The subjects were asked to concentrate on certain features, such as faces that looked old or were smiling. While looking at a rapidly presented series of face images, the EEGs of the subjects were fed to a neural network, which inferred whether any image was detected by the brain as matching what the subjects were looking for.

Based on this information, the neural network adapted its estimate of what kind of faces people were thinking of. Finally, the images generated by the computer were evaluated by the participants, and they almost perfectly matched the features the participants were thinking of. The accuracy of the experiment was 83 per cent.

“The technique combines natural human responses with the computer’s ability to create new information. In the experiment, the participants were only asked to look at the computer-generated images.

The computer, in turn, modelled the images displayed and the human reaction toward the images by using human brain responses. From this, the computer can create an entirely new image that matches the user’s intention,” says Tuukka Ruotsalo, Academy of Finland Research Fellow at the University of Helsinki, Finland and Associate Professor at the University of Copenhagen, Denmark.
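The closed loop Ruotsalo describes can be caricatured in a few lines. Everything below is an invented stand-in: a plain latent vector replaces the generative network's latent space, and a simple similarity rule replaces the EEG classifier:

```python
# Toy neuroadaptive loop (illustrative stand-ins only): generate candidates,
# ask a stand-in "brain response" which ones matched the attended feature,
# and move the latent estimate toward the flagged candidates.
import numpy as np

rng = np.random.default_rng(2)
latent_dim = 8
# Latent direction of the attended feature (e.g. "smiling"); the loop never
# reads this directly, only the per-image flags derived from it.
target = rng.normal(size=latent_dim)

def brain_flags(latents):
    """Stand-in for the EEG classifier: flags the top 20% most target-like."""
    scores = latents @ target
    return scores > np.percentile(scores, 80)

estimate = np.zeros(latent_dim)
for _ in range(10):
    # "Show" a batch of 100 faces by sampling latent vectors near the estimate.
    latents = estimate + rng.normal(size=(100, latent_dim))
    # Update the estimate toward the images the brain responded to.
    estimate = latents[brain_flags(latents)].mean(axis=0)

# After a few rounds the estimate aligns with the attended feature direction.
cosine = estimate @ target / (np.linalg.norm(estimate) * np.linalg.norm(target))
```

The point of the sketch is the direction of information flow: the algorithm never sees the target feature itself, only which displayed images evoked a matching brain response, yet the estimate converges toward that feature.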

Unconscious attitudes may be exposed

Generating images of the human face is only one example of the technique’s potential uses. One practical benefit of the study may be that computers can augment human creativity.

“If you want to draw or illustrate something but are unable to do so, the computer may help you to achieve your goal. It could just observe the focus of attention and predict what you would like to create,” Ruotsalo says. However, the researchers believe that the technique may be used to gain understanding of perception and the underlying processes in our mind.

“The technique does not recognise thoughts but rather responds to the associations we have with mental categories. Thus, while we are not able to find out the identity of a specific ‘old person’ a participant was thinking of, we may gain an understanding of what they associate with old age. We, therefore, believe it may provide a new way of gaining insight into social, cognitive and emotional processes,” says Senior Researcher Michiel Spapé.” Neuroscience Journal

September 2020:

Reprogramming Brain Cells Enables Flexible Decision-Making

“Humans, like other animals, have the ability to constantly adapt to new situations. Researchers at the Brain Research Institute of the University of Zurich have used a mouse model to reveal which neurons in the brain are in command when guiding adaptive behavior. Their new study contributes to our understanding of decision-making processes in healthy and infirm people.

Greetings without handshakes, mandatory masks in trains, sneezing into elbow crooks – the COVID-19 pandemic dramatically illustrates how important it can be for humans to shed habitual behaviors and to learn new ones. Animals, too, must be capable of rapidly adapting to changes in environmental conditions.

“The plasticity of the brain forms the foundation of this ability,” says Fritjof Helmchen, the co-director of the Brain Research Institute at the University of Zurich, who also heads the Neuroscience Center Zurich. “But the biological processes that enable this amazing feat are still poorly understood.” Helmchen’s team has now successfully taken a first step towards illuminating these processes.

Their study, just published in the scientific journal Nature, demonstrates that the orbitofrontal cortex, a region of the cerebral cortex that sits behind the eyes, is capable of reprogramming neurons located in sensory areas.

Observing brain cells in the act of relearning

In their experiments with mice, the researchers simulated a relearning process under controlled conditions and investigated what happens in the brain at the level of individual neurons during that process. The researchers first trained the animals to lick every time they touched a strip of coarse-grit sandpaper with their whiskers and rewarded the response with a drink of sucrose water.

However, the mice were not allowed to lick when they brushed their whiskers against fine-grain sandpaper; if they did, they were punished with a mild irritating noise. Once the mice had learned to perform their task, the tables were turned: the reward was now delivered after whisking against fine-grain rather than coarse-grit sandpaper. The mice picked up this new, opposite behavior pattern after little practice.

During the training, the neuroscientists employed molecular biological and imaging techniques to analyze the function of individual neurons in the brain cortices involved.

Their analysis revealed that a group of brain cells in the orbitofrontal cortex is particularly active during the relearning process. These cells have long axons that extend into the sensory area in mice that processes tactile stimuli. The cells in this area initially followed the old activity pattern, but some of them then adapted to the new situation. When specific neurons in the orbitofrontal cortex were deliberately inactivated, relearning was impaired and the neurons in the sensory area no longer exhibited modification in their activity.

“We were thus able to demonstrate that a direct connection from the orbitofrontal cortex to sensory areas of the brain exists and that some neurons get remapped there,” explains Helmchen.

“The plasticity of those cells and the instructions they receive from the higher-order orbitofrontal cortex appear to be crucial to behavioral flexibility and our ability to adapt to new situations.”

It has long been known that the orbitofrontal cortex is involved in decision-making processes. It is in charge, to a certain degree, of enabling us to react appropriately and successfully to exogenous circumstances.

“But the neural circuits underlying this function were unknown until now,” says Abhishek Banerjee, lead author of the study, now an Associate Professor at Newcastle University, UK.

The researchers believe that the fundamental processes they observed in mice take place in a similar way in the human brain as well. “This deepened knowledge about complex brain processes involved in decision making is important,” explains Helmchen.

“Our research findings may contribute to a better understanding of brain disorders in which the flexibility in decision making is impaired, as it is, for example in various forms of autism and schizophrenia.” Clearly, he says, having difficulties or being unable to adapt one’s behavior poses a severe problem for affected people.” Neuroscience Journal

Vaccine Trial Is Halted After Patient’s Adverse Reaction

The news about AstraZeneca’s trial came on the same day that the company and others pledged to thoroughly vet any vaccine.

Francis Collins, director of the National Institutes of Health, said the hold on the AstraZeneca trial “is a concrete example of how even a single case of unexpected illness is sufficient to hold a clinical trial in multiple countries” — and evidence that “we cannot compromise” on safety.

The pharmaceutical company AstraZeneca halted global trials of its COVID-19 vaccine because of a serious and unexpected adverse reaction in a participant, the company said.

The trial’s halt allows the British-Swedish company to conduct a safety review. How long the hold will last is unclear.

In a statement, the company described the halt as a “routine action which has to happen whenever there is a potentially unexplained illness in one of the trials, while it is investigated, ensuring we maintain the integrity of the trials.”

The news of AstraZeneca pausing its trial came the same day that the company and eight other drugmakers reaffirmed that they would not move forward with such products before thoroughly vetting them for safety and efficacy.

The companies did not rule out seeking an emergency authorization of their vaccines, but promised that decisions about any potential COVID-19 vaccine would be made based on “large, high-quality clinical trials” and that the companies would follow guidance from regulatory agencies like the Food and Drug Administration.

“At this stage, we don’t know if the events that triggered the hold are related to vaccination,” said Dr. Luciana Borio, who oversaw public health preparedness for the National Security Council. “But it is important for them to be thoroughly investigated.”

In large trials like the ones AstraZeneca is overseeing, the company said, participants do sometimes become sick by chance, but such illnesses “must be independently reviewed to check this carefully.”

The company said it was “working to expedite the review of the single event to minimize any potential impact on the trial timeline” and that it was “committed to the safety of our participants and the highest standards of conduct in our trials.”

A person familiar with the situation, and who spoke on the condition of anonymity, said that the participant had been enrolled in a Phase 2/3 trial based in the United Kingdom. The individual also said that a volunteer in the U.K. trial had been found to have transverse myelitis, an inflammatory syndrome that affects the spinal cord and is often sparked by viral infections. However, the timing of this diagnosis, and whether it was directly linked to AstraZeneca’s vaccine, is unclear.

AstraZeneca declined to comment on the location of the participant and did not confirm the diagnosis of transverse myelitis. “The event is being investigated by an independent committee, and it is too early to conclude the specific diagnosis,” the company said.

AstraZeneca’s vaccine is currently in Phase 2/3 trials in England and India, and in Phase 3 trials in Brazil, South Africa and more than 60 sites in the United States. The company intended for its U.S. enrollment to reach 30,000.

AstraZeneca is one of three companies whose vaccines are in late-stage clinical trials in the United States.” NY Times

Let Your Brain Rest: Boredom Can Be Good For Your Health

“The human brain is a powerful tool. Always on, the brain is thinking, making decisions, handling stressors, and running subconscious activities. But as large as the brain’s capacity is, it also has limits. Alicia Walf, a neuroscientist and senior lecturer in the Department of Cognitive Science at Rensselaer Polytechnic Institute, says it is critical for brain health to let yourself be bored from time to time.

Being bored can improve social connections. When neuroscientists study brain activity, they often compare which areas are “on” when people perform a specific cognitive task versus when they are told to do nothing. Remarkably, there is extensive activity during the do-nothing part of the experiment. This has led social neuroscientists to discover what is called the default mode network: a set of brain regions that are active by default, when we are not doing other things. It also turns out that when we are not busy with other thoughts and activities, we focus inward as well as on social interactions.

Being bored can help foster creativity. Many scientists and artists have reported being inspired or solving a complex problem when they had actually stopped thinking about it. This eureka moment is called insight. Neuroscientists have shown different patterns of brain activity when people solve problems through insight compared to working through them step-by-step. Even the ancient Greek Archimedes is said to have come up with his major finding about the displacement of water while taking a bath.

Additionally, being bored can improve overall brain health. During exciting times, the brain releases a chemical called dopamine which is associated with feeling good. When the brain has fallen into a predictable, monotonous pattern, many people feel bored, even depressed. This might be because we have lower levels of dopamine. One approach is to retrain the brain to actually enjoy these less exciting, and perhaps boring, times. Especially when we are young, our brains are able to adapt to new ways to think and behave. “Give boredom a try and see what your brain comes up with,” says Walf.”

Neuroscience Journal

Brains are talking to computers, and computers to brains.

Are our daydreams safe?

‘A Human Rights Issue’

“To grasp why Dr. Yuste frets so much about brain-reading technology, it helps to understand his research. He helped pioneer a technology that can read and write to the brain with unprecedented precision, and it doesn’t require surgery. But it does require genetic engineering.

Dr. Yuste infects mice with a virus that inserts two genes into the animals’ neurons. One prompts the cells to produce a protein that makes them sensitive to infrared light; the other makes the neurons emit light when they activate. Thereafter, when the neurons fire, Dr. Yuste can see them light up. And he can activate neurons in turn with an infrared laser. Dr. Yuste can thus read what’s happening in the mouse brain and write to the mouse’s brain with an accuracy impossible with other techniques.

And he can, it appears, make the mice “see” things that aren’t there.

In one experiment, he trained mice to take a drink of sugar water after a series of bars appeared on a screen. He recorded which neurons in the visual cortex fired when the mice saw those bars. Then he activated those same neurons with the laser, but without showing them the actual bars. The mice had the same reaction: They took a drink.

He likens what he did to implanting a hallucination. “We were able to implant into these mice perceptions of things that they hadn’t seen,” he told me. “We manipulated the mouse like a puppet.”

This method, called optogenetics, is a long way from being used in people. To begin with, we have thicker skulls and bigger brains, making it harder for infrared light to penetrate. And from a political and regulatory standpoint, the bar is high for genetically engineering human beings. But scientists are exploring workarounds — drugs and nanoparticles that make neurons receptive to infrared light, allowing precise activation of neurons without genetic engineering.

The lesson in Dr. Yuste’s view is not that we’ll soon have lasers mounted on our heads that play us “like pianos,” but that brain-reading and possibly brain-writing technologies are fast approaching, and society isn’t prepared for them.

“We think this is a human rights issue,” he told me.

In a 2017 paper in the journal Nature, Dr. Yuste and 24 other signatories, including Dr. Gallant, called for the formulation of a human rights declaration that explicitly addressed “neurorights” and what they see as the threats posed by brain-reading technology before it becomes ubiquitous. Information taken from people’s brains should be protected like medical data, Dr. Yuste says, and not exploited for profit or worse. And just as people have the right not to self-incriminate with speech, we should have the right not to self-incriminate with information gleaned from our brains.

Dr. Yuste’s activism was prompted in part, he told me, by the large companies suddenly interested in brain-machine research.

Say you’re using your Google Cap. And like many products in the Google ecosystem, it collects information about you, which it uses to help advertisers target you with ads. Only now, it’s not harvesting your search results or your map location; it’s harvesting your thoughts, your daydreams, your desires.

Who owns those data?

Or imagine that writing to the brain is possible. And there are lower-tier versions of brain-writing gizmos that, in exchange for their free use, occasionally “make suggestions” directly to your brain. How will you know if your impulses are your own, or if an algorithm has stimulated that sudden craving for Ben & Jerry’s ice cream or Gucci handbags?

“People have been trying to manipulate each other since the beginning of time,” Dr. Yuste told me. “But there’s a line that you cross once the manipulation goes directly to the brain, because you will not be able to tell you are being manipulated.”

When I asked Facebook about concerns around the ethics of big tech entering the brain-computer interface space, Mr. Chevillet, of Facebook Reality Labs, highlighted the transparency of its brain-reading project. “This is why we’ve talked openly about our B.C.I. research — so it can be discussed throughout the neuroethics community as we collectively explore what responsible innovation looks like in this field,” he said in an email.

Ed Cutrell, a senior principal researcher at Microsoft, which also has a B.C.I. program, emphasized the importance of treating user data carefully. “There needs to be a clear sense of where that information goes,” he told me. “As we are sensing more and more about people, to what extent is that information I’m collecting about you yours?”

Some find all this talk of ethics and rights, if not irrelevant, then at least premature.

Medical scientists working to help paralyzed patients, for example, are already governed by HIPAA laws, which protect patient privacy. Any new medical technology has to go through the Food and Drug Administration approval process, which includes ethical considerations.

(Ethical quandaries still arise, though, notes Dr. Kirsch. Let’s say you want to implant a sensor array in a patient suffering from locked-in syndrome. How do you get consent to conduct surgery that might change the person’s life for the better from someone who can’t communicate?)

Leigh Hochberg, a professor of engineering at Brown University and part of the BrainGate initiative, sees the companies now piling into the brain-machine space as a boon. The field needs these companies’ dynamism — and their deep pockets, he told me. Discussions about ethics are important, “but those discussions should not at any point derail the imperative to provide restorative neurotechnologies to people who could benefit from them,” he added.

Ethicists, Dr. Jepsen told me, “must also see this: The alternative would be deciding we aren’t interested in a deeper understanding of how our minds work, curing mental disease, really understanding depression, peering inside people in comas or with Alzheimer’s, and enhancing our abilities in finding new ways to communicate.”

There’s even arguably a national security imperative to plow forward. China has its own version of BrainGate. If American companies don’t pioneer this technology, some think, Chinese companies will. “People have described this as a brain arms race,” Dr. Yuste said.

Not even Dr. Gallant, who first succeeded in translating neural activity into a moving image of what another person was seeing — and who was both elated and horrified by the exercise — thinks the Luddite approach is an option. “The only way out of the technology-driven hole we’re in is more technology and science,” he told me. “That’s just a cool fact of life.” NY Times

August 2020:

How Your Phone Is Used to Track You, and What You Can Do About It

Smartphone location data, often used by marketers, has been useful for studying the spread of COVID-19. But the information raises troubling privacy questions.

“As researchers and journalists try to understand how the COVID-19 pandemic is affecting people’s behavior, they have repeatedly relied on location information from smartphones. The data allows for an expansive look at the movements of millions of people, but it raises troublesome questions about privacy.

In several articles, The NY Times has used location data provided by a company called Cuebiq, which analyzes data for advertisers and marketers. This data comes from smartphone users who have agreed to share their locations with certain apps, such as ones that provide weather alerts or information on local gas stations. Cuebiq helps app makers use technology like GPS to determine the location of people’s phones, and in turn some of the app makers provide data to Cuebiq for it to analyze.

The data obtained by The Times is anonymized and aggregated, meaning that the journalists see broad statistics compiled by geographic area — such as the median distance moved per day by devices in a census tract. The Times did not receive information about individual phones and did not see the path any particular phone took.
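As a sketch of what “anonymized and aggregated” means here, the statistic described above, a per-tract summary such as the median distance moved per device per day, can be computed from per-device rows that the journalists themselves never see. The record layout and values below are invented for illustration, not Cuebiq's actual schema.

```python
from collections import defaultdict
from statistics import median

# Hypothetical per-device daily records as an aggregator might hold them:
# (anonymous device id, census tract, distance moved that day in km).
records = [
    ("dev1", "tract_A", 0.4),
    ("dev2", "tract_A", 2.9),
    ("dev3", "tract_A", 1.1),
    ("dev4", "tract_B", 7.5),
    ("dev5", "tract_B", 12.0),
]

# Aggregation step: only the per-tract statistic leaves this function;
# the individual rows (and thus any one phone's path) do not.
by_tract = defaultdict(list)
for _, tract, km in records:
    by_tract[tract].append(km)

aggregate = {tract: median(kms) for tract, kms in by_tract.items()}
print(aggregate)  # per-tract medians only, no individual devices
```

The privacy claim rests on this boundary: recipients get the dictionary of medians, while the raw rows stay with the data provider.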

About 15 million people in the United States use the relevant apps daily and allow them to track their location regularly. The aggregate data provides a representative sample of the population, according to academic papers that studied Cuebiq’s data in different metro areas.

What are the dangers of this data?

Although the data excludes names, phone numbers and other identifying information, even anonymous location information can be revealing. The Times has reported on the intrusiveness of such data, which can show intimate details like trips to doctor’s offices and outings with romantic partners.

The fact that companies are collecting, storing and selling location information about individuals at all presents risks. Hackers or people with access to raw location data could identify or follow a person without consent, by pinpointing, for example, which phone regularly spent time at that person’s home address.

Different companies have widely varying approaches to handling the information, including deleting large portions of it for privacy reasons or selling the raw data with no protections. Location data on individuals is used for purposes like marketing and analysis for hedge funds and law enforcement. There is no federal law in the United States that limits the use of location information in this way, although some have been proposed. Cuebiq said it collects and stores raw location data but does not sell it.

What are the benefits of this data?

Location data from smartphones is used for several purposes, most frequently for targeted advertising. For example, companies may show ads for sneakers to people who often go to a gym. Companies such as Apple and Google use similar information for mapping and traffic monitoring, or to tell people when stores are likely to be busy.

Makers of apps that sell the data say it allows them to give users their services without charging them money.

During the coronavirus pandemic, location information has shown where people are following social distancing rules, and where they have traveled — enabling analysis of potential hot spots. The Times has used this data to show that people from low-income areas were less likely to be able to shelter at home than people from high-income locations and to demonstrate how the virus may have spiraled out of control in the United States.

How would I know if my data is collected?

It can be difficult for people to keep track of whether and how their data is being gathered. Android-based devices and iPhones both require apps to ask users to enable location services before collecting the information, but the explanations people see when prompted to give permission are often incomplete or misleading. An app may tell users that granting access to their location will help them get weather alerts, but not mention that the data will be sold. That disclosure is often buried in a densely worded privacy policy.

In a recent test of five apps that provide information for Cuebiq’s data set, the disclosures indicated that the data would be shared for advertising and analysis, and users were directed to information on limiting that sharing. But some apps made it easier than others to stop the data collection. And in a test last year by NY Times journalists of an app that sent data to Cuebiq, the initial prompt for the user to allow access to location information did not mention all the ways it would be used. That app later changed its messaging.

Even with such disclosures, it may not be clear to users how frequently someone’s information is collected and what it can show. In Europe and California, users can request their data. Elsewhere, policies vary by company.

You can request your data from Cuebiq or ask the company to delete your data regardless of where you live. Cuebiq ties your data to your phone’s so-called advertising ID, which is used by marketers and others to differentiate phones from each other, and will send you the information associated with that ID. To prevent people from getting data on others’ IDs, the company requires you to download an app that verifies the number and then makes the request. You can then delete the app without affecting your request.

How can I opt out?

If you want to prevent Cuebiq from collecting your data, the easiest way is to disable the advertising ID on your phone. If you disable it, Cuebiq will no longer keep track of your device.

Cuebiq also provides several other ways to opt out of location tracking, outlined if you click “Control” on the company’s privacy page.

However, opting out of Cuebiq’s database will not prevent your information from being collected by a variety of other companies that gather and store precise location information. Some provide similar options, but not all do, and it is difficult to keep track of the myriad firms in the location-tracking industry.

If you want to avoid collection of your location data altogether, your best bet is to evaluate the individual apps on your phone to see whether they are collecting more about you than you would like. Prevent all but your most important apps from gaining access to the data, and allow them to get it only when you are using the app.” NY Times

Adaptation in Single Neurons Provides Memory for Language Processing

“Did the man bite the dog, or was it the other way around? When processing an utterance, words need to be assembled into the correct interpretation within working memory. One aspect of comprehension is to establish ‘who did what to whom’. This process of unification takes much longer than basic events in neurobiology, like neuronal spikes or synaptic signaling. Fitz, lead investigator at the Neurocomputational Models of Language group at the Max Planck Institute for Psycholinguistics, and his colleagues propose an account where adaptive features of single neurons supply memory that is sufficiently long-lived to bridge this temporal gap and support language processing.

Together with researchers Uhlmann, van den Broek, Hagoort, Magnus Petersson (all Max Planck Institute for Psycholinguistics) and Duarte (Jülich Research Centre), Fitz studied working memory in spiking networks through an innovative combination of experimental language research with methods from computational neuroscience.

In a sentence comprehension task, circuits of biological neurons and synapses were exposed to sequential language input which they had to map onto semantic relations that characterize the meaning of an utterance. For example, ‘the cat chases a dog’ means something different than ‘the cat is chased by a dog’ even though both sentences contain similar words. The various cues to meaning need to be integrated within working memory to derive the correct message. The researchers varied the neurobiological features in computationally simulated networks and compared the performance of different versions of the model. This allowed them to pinpoint which of these features implemented the memory capacity required for sentence comprehension.

They found that working memory for language processing can be provided by the down-regulation of neuronal excitability in response to external input. “This suggests that working memory could reside within single neurons, which contrasts with other theories where memory is either due to short-term synaptic changes or arises from network connectivity and excitatory feedback”, says Fitz.

Their model shows that this neuronal memory is context-dependent and sensitive to serial order, which makes it ideally suited for language. Additionally, the model was able to establish binding relations between words and semantic roles with high accuracy.

“It is crucial to try and build language models that are directly grounded in basic neurobiological principles,” declares Fitz. “This work shows that we can meaningfully study language at the neurobiological level of explanation, using a causal modelling approach that may eventually allow us to develop a computational neurobiology of language.” Neuroscience Journal

Body Weight Has Surprising and Alarming Impact on Brain Function

“In one of the largest studies linking obesity with brain dysfunction, scientists analyzed over 35,000 functional neuroimaging scans using single-photon emission computerized tomography (SPECT) from more than 17,000 individuals to measure blood flow and brain activity. Low cerebral blood flow is the #1 brain imaging predictor that a person will develop Alzheimer’s disease. It is also associated with depression, ADHD, bipolar disorder, schizophrenia, traumatic brain injury, addiction, suicide, and other conditions.

“This study shows that being overweight or obese seriously impacts brain activity and increases the risk for Alzheimer’s disease as well as many other psychiatric and cognitive conditions,” explained Daniel G. Amen, MD, the study’s lead author and founder of Amen Clinics, one of the leading brain-centered mental health clinics in the United States.

Striking patterns of progressively reduced blood flow were found in virtually all regions of the brain across categories of underweight, normal weight, overweight, obesity, and morbid obesity. These were noted while participants were in a resting state as well as while performing a concentration task. In particular, brain areas noted to be vulnerable to Alzheimer’s disease, the temporal and parietal lobes, hippocampus, posterior cingulate gyrus, and precuneus, were found to have reduced blood flow along the spectrum of weight classification from normal weight to overweight, obese, and morbidly obese.

Considering the latest statistics showing that 72% of Americans are overweight, of whom 42% are obese, this is distressing news for America’s mental and cognitive health.

Commenting on this study, George Perry, PhD, Editor-in-Chief of the Journal of Alzheimer’s Disease and Semmes Foundation Distinguished University Chair in Neurobiology at The University of Texas at San Antonio, stated, “Acceptance that Alzheimer’s disease is a lifestyle disease, little different from other age-related diseases, that is the sum of a lifetime is the most important breakthrough of the decade. Dr. Amen and collaborators provide compelling evidence that obesity alters blood supply to the brain to shrink the brain and promote Alzheimer’s disease. This is a major advance because it directly demonstrates how the brain responds to our body.”

This study highlights the need to address obesity as a target for interventions designed to improve brain function, be they Alzheimer’s disease prevention initiatives or attempts to optimize cognition in younger populations. Such work will be crucial in improving outcomes across all age groups.

Although the results of this study are deeply concerning, there is hope. Dr. Amen added, “One of the most important lessons we have learned through 30 years of performing functional brain imaging studies is that brains can be improved when you put them in a healing environment by adopting brain-healthy habits, such as a healthy calorie-smart diet and regular exercise.” Neuroscience Journal

Bacteria in the Gut Have a Direct Line to the Brain

The findings shed light on the potential mechanisms behind neurological abnormalities and intestinal diseases, including IBS.

Summary: Sensory neurons that send signals from the intestines to the brainstem extend to the interface of areas of the intestine that are exposed to high levels of microbial compounds. When researchers turned off these neurons, they observed activated neurons in the brainstem as well as activation of gut neurons that control intestinal motility.

With its 100 million neurons, the gut has earned a reputation as the body’s “second brain,” corresponding with the real brain to manage things like intestinal muscle activity and enzyme secretions. A growing community of scientists is now seeking to understand how gut neurons interact with their brain counterparts, and how failures in this process may lead to disease.

Now, new research shows that gut bacteria play a direct role in these neuronal communications, determining the pace of intestinal motility. The research, conducted in mice and published in Nature, suggests a remarkable degree of communication between our nervous system and the microbiota. It may also have implications for treating gastrointestinal conditions.

“We describe how microbes can regulate a neuronal circuit that starts in the gut, goes to the brain, and comes back to the gut,” says Daniel Mucida, associate professor and head of the Laboratory of Mucosal Immunology. “Some of the neurons within this circuit are associated with irritable bowel syndrome, so it is possible that dysregulation of this circuit predisposes to IBS.”

The work was led by Paul A. Muller, a former graduate student in the Mucida lab.

How microbes control motility

To understand how the central nervous system senses microbes within the intestines, Mucida and his colleagues analyzed gut-connected neurons in mice that lacked microbes entirely: so-called germ-free mice, which are raised from birth in an isolated environment and given only food and water that has been thoroughly sterilized. They found that some gut-connected neurons are more active in the germ-free mice than in controls and express high levels of a gene called cFos, which is a marker for neuronal activity.

This increase in neuronal activity, in turn, causes food to move more slowly than usual through the digestive tract of the mice. When the researchers treated the germ-free mice with a drug that silences these gut neurons, they saw intestinal motility speed up.

It’s unclear how the neurons sense the presence of gut microbes, but Mucida and his colleagues found hints that the key may be a set of compounds known as short-chain fatty acids, which are made by gut bacteria. They found that lower levels of these fatty acids within the guts of mice were associated with greater activity of the gut-connected neurons. And when they boosted the animals’ gut levels of these compounds, the activity of their gut neurons decreased. Other microbial compounds and gut hormones that change with the microbiota were also found to regulate neuronal activity, suggesting additional players in this circuit.

Neurons in control

Further experiments revealed a conundrum, however. The scientists saw that the particular type of gut-connected neurons activated by the absence of microbes did not extend to the exposed surface of the intestines, suggesting that they cannot sense the fatty acid levels directly.

So Mucida and his colleagues decided to trace the circuit backwards and found a set of brainstem neurons that show increased activity in the germ-free mice. When the researchers manipulated control mice to specifically activate these same neurons, they saw an increase in the activity of the gut neurons and a decrease in intestinal motility.

The researchers continued to work backwards, next focusing their attention on the sensory neurons that send signals from the intestines to the brainstem. Their experiments revealed these sensory neurons extended to the interface of areas of the intestine that are exposed to high levels of microbial compounds, including fatty acids. They turned off these neurons, to mimic what happens in germ-free mice that lack the fatty acids, or associated gut signals, and observed activated neurons in the brainstem, as well as activation of the gut neurons that control intestinal motility.”

Neuroscience Journal

July 2020:

A Future Without Cars Is Amazing

Why do American cities waste so much space on cars?

“As Covid-19 lockdowns crept across the globe this winter and spring, an unusual sound fell over the world’s metropolises: the hush of streets that were suddenly, blessedly free of cars. City dwellers reported hearing bird songs, wind and the rustling of leaves.

You could smell the absence of cars, too.

Cars also took a break from killing people. About 10 pedestrians die on New York City’s streets in an ordinary month; under lockdown, the city went a record two months without a single pedestrian fatality. In California, vehicle collisions plummeted 50 percent, cutting crashes resulting in injury or death by about 6,000 per month.

But there is a catch: Cities are beginning to cautiously open back up again, and people are wondering how they’re going to get into work. Many are worried about the spread of the virus on public transit. Are cars our only option? How will we find space for all of them?

In much of Manhattan, the average speed of traffic before the pandemic had fallen to 7 miles per hour. In Midtown, it was less than 5 m.p.h. That’s only slightly faster than walking and slower than riding a bike. Will traffic soon be worse than ever?

Not if we choose another path.

If riders wear face masks — and if there are enough subway cars, buses, bike lanes and pedestrian paths for people to avoid intense overcrowding — transit might be no less safe than cars, in terms of the risk of the spread of disease. In all other measures of safety, transit is far safer than cars.

What’s that you say? There aren’t enough buses in your city to avoid overcrowding, and they’re too slow, anyway? Pedestrian space is already hard to find? Well, right. That’s car dependency. And it’s exactly why cities need to plan for a future of fewer cars, a future in which owning an automobile, even an electric one, is neither the only way nor the best way to get around town.

Automobiles are not just dangerous and bad for the environment, they are also profoundly wasteful of the land around us: Cars take up way too much physical space to transport too few people. It’s geometry.

In most American cities, wherever you look you see a landscape constructed primarily for the movement and storage of automobiles, not the enjoyment of people: endless wide boulevards and freeways for cars to move swiftly; each road lined with parking spaces for cars at rest; retail establishments ringed with spots for cars; houses built around garages for cars; and a gas station, for cars to feed, on every other corner.

In the most car-dependent cities, the amount of space devoted to automobiles reaches truly ridiculous levels. In Los Angeles, for instance, land for parking exceeds the entire land area of Manhattan, enough space to house almost a million more people at Los Angeles’s prevailing density.

This isn’t a big deal in the parts of America where space is seemingly endless. But in the most populated cities, physical space is just about the most precious resource there is. The land value of Manhattan alone is estimated to top $1.7 trillion. Why are we giving so much of it to cars?

Without cars, Manhattan’s streets could give priority to more equitable and accessible ways of getting around, including an extensive system of bike “superhighways” and bus rapid transit: a bus system with dedicated lanes in the roadway, creating a service that approaches the capacity, speed and efficiency of the subway at a fraction of the cost.

Eliminating most cars in Manhattan would also significantly clean up the air for the entire region. It would free up space for new housing and create hundreds of acres of new parks and pedestrian promenades, improving the fundamental health, beauty and livability of America’s largest metropolis.

Yet when I got my speedy ride, I quickly realized it was kind of pointless, because most of the time there’s too much traffic where I live to go any faster than a golf cart. This is the drab reality of driving you’ll never see in car ads — a daily, rage-inducing grind of traffic, parking, and shelling out to fill up. It’s an option many people choose not out of any love affair with cars, but because driving is the least inconvenient way of getting around where they live and work.

I’ve grown increasingly disillusioned about America’s tolerance for the public health and environmental damage caused by cars, not to mention the frustrations of commuting by car. And I’m losing hope that the car industry will be able to fix them anytime soon.

I’ve spent much of the last decade watching Silicon Valley take on that industry, and I once had great expectations that techies would soon make cars substantially cleaner, safer, more efficient, more convenient and cheaper to operate.

But many of their innovations are turning into a bust — or, at the very least, are not making enough of a difference. Uber and Lyft once promised to reduce traffic through car-pooling.

Tesla turned the electric car into a mainstream object of lust — but most of the rest of the auto industry is struggling to get consumers to switch over from gas, so it could take years to electrify America’s entire fleet.

And cars take up space even while they’re not in use. They need to be parked, which consumes yet more space on the sides of streets or in garages.

Add it all up and you get a huge number: in addition to the 2,450 acres of roadway in Manhattan, nearly 1,000 more acres — an area about the size of Central Park — are occupied by parking garages, gas stations, carwashes, car dealerships and auto repair shops. There is three times as much roadway for cars in Manhattan as there is for bikes, and more road for cars than there is sidewalk for pedestrians.
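
A rough tally, using only the acreage figures quoted above plus an outside estimate of Manhattan’s total land area (about 14,600 acres, roughly 22.8 square miles; that total is not from the article), suggests how large a share of the island serves cars:

```python
# Back-of-the-envelope tally of Manhattan land devoted to cars,
# using the figures quoted in the article.
ROADWAY_ACRES = 2450      # roadway in Manhattan (article figure)
OTHER_CAR_ACRES = 1000    # garages, gas stations, dealerships, etc. (article figure)
MANHATTAN_ACRES = 14600   # ~22.8 sq mi of land; outside estimate, not from the article

def car_land_share(roadway=ROADWAY_ACRES, other=OTHER_CAR_ACRES,
                   total=MANHATTAN_ACRES):
    """Fraction of Manhattan's land area given over to cars."""
    return (roadway + other) / total

print(f"{car_land_share():.0%} of Manhattan's land serves cars")
```

The point of the sketch is only the order of magnitude: the article’s own numbers put the car-dedicated share at roughly a quarter of the island.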

How would people get around in a Manhattan without private cars?

Mostly on foot, by bus or by subway; often on a bicycle, e-bike, scooter, or some future light, battery-powered “micromobility” device; sometimes, in a pinch, in a taxi or Uber.

Some of these may not sound like your cup of tea. Buses are slow, bicycles are dangerous, and you wouldn’t be caught dead on a scooter, let alone a one-wheeled skateboard. But that’s only because you’re imagining these other ways of getting around as they exist today, in the world of cars.

Cars make every other form of transportation a little bit terrible. The absence of cars, then, exerts its own kind of magic — take private cars away, and every other way of getting around gets much better.

Given these threats, how can American cities continue to justify wasting such enormous tracts of land on death machines?” NY Times

Ennio Morricone, Oscar-Winning Composer of Film Scores, Dies at 91

His vast output included atmospheric music for spaghetti westerns in his native Italy and scores for some 500 movies by a Who’s Who of directors.

“Ennio Morricone, the Italian composer whose atmospheric scores for spaghetti westerns and some 500 films by a Who’s Who of international directors made him one of the world’s most versatile and influential creators of music for the modern cinema, died on Monday in Rome. He was 91.

His death, at a hospital, was confirmed by his lawyer, Giorgio Assumma, who said that Mr. Morricone was admitted there last week after falling and fracturing a femur. Mr. Assumma also distributed a statement that Mr. Morricone had written himself, titled: “I, Ennio Morricone, am dead.”

To many cineastes, Maestro Morricone (pronounced mo-ree-CONE-eh) was a unique talent, composing melodic accompaniments to comedies, thrillers and historical dramas by Bernardo Bertolucci, Pier Paolo Pasolini, Terrence Malick, Roland Joffé, Brian De Palma, Barry Levinson, Mike Nichols, John Carpenter, Quentin Tarantino and other filmmakers.

He scored many popular films of the past 40 years: “La Piovra” (“The Octopus”), Mr. Carpenter’s “The Thing” (1982), Mr. De Palma’s “The Untouchables” (1987), Roman Polanski’s “Frantic” (1988), Giuseppe Tornatore’s “Cinema Paradiso” (1988), Wolfgang Petersen’s “In the Line of Fire” (1993), and Mr. Tarantino’s “The Hateful Eight” (2015).

Mr. Morricone won his first competitive Academy Award for his score for “The Hateful Eight,” an American western mystery thriller for which he also won a Golden Globe. In a career showered with honors, he had previously won an Oscar for lifetime achievement (2007) and was nominated for five other Academy Awards; in addition, he won two Golden Globes, four Grammys and dozens of international awards.

But the work that made him world famous, and that was best known to moviegoers, was his blend of music and sound effects for Sergio Leone’s so-called spaghetti westerns of the 1960s: a ticking pocket watch, a sign creaking in the wind, buzzing flies, a twanging Jew’s harp, haunting whistles, cracking whips, gunshots and a bizarre, wailing “ah-ee-ah-ee-ah,” played on a sweet potato-shaped wind instrument called an ocarina.

Imitated, scorned, spoofed, what came to be known as “The Dollars Trilogy” — “A Fistful of Dollars” (1964), “For a Few Dollars More” (1965) and “The Good, the Bad and the Ugly” (1966), all released in the United States in 1967 — starred Clint Eastwood as “The Man With No Name” and were enormous hits, with a combined budget of $2 million and gross worldwide receipts of $280 million.

The trilogy’s Italian dialogue was dubbed for the English-speaking market, and the action was brooding and slow, with clichéd close-ups of gunfighters’ eyes. But Mr. Morricone, breaking the unwritten rule never to upstage actors with music, infused it all with wry sonic weirdness and melodramatic strains that many fans embraced with cultlike devotion and that critics called viscerally true to Mr. Leone’s vision of the Old West.

“In the films that established his reputation in the 1960s, the series of spaghetti westerns he scored for Mr. Leone, Mr. Morricone’s music is anything but a backdrop,” The New York Times critic Jon Pareles wrote in 2007. “It’s sometimes a conspirator, sometimes a lampoon, with tunes that are as vividly in the foreground as any of the actors’ faces.”

Mr. Morricone also scored Mr. Leone’s “Once Upon a Time in the West” (1968) and his Jewish gangster drama, “Once Upon a Time in America” (1984), both widely considered masterpieces. But he became most closely identified with “The Dollars Trilogy,” and in time grew weary of answering for their lowbrow sensibilities.

Asked by The Guardian in 2006 why “A Fistful of Dollars” had made such an impact, he said: “I don’t know. It’s the worst film Leone made and the worst score I did.” “The Ecstasy of Gold,” the theme song for “The Good, the Bad and the Ugly,” was one of Mr. Morricone’s biggest hits. It was recorded by the cellist Yo-Yo Ma on a 2004 album of Mr. Morricone’s compositions and used in concert by two rock bands: as closing music for the Ramones and as the introductory theme for Metallica.

Mr. Morricone looked professorial in bow ties and spectacles, with wisps of flyaway white hair. He sometimes holed up in his palazzo in Rome and wrote music for weeks on end, composing not at a piano but at a desk. He heard the music in his mind, he said, and wrote it in pencil on score paper for all orchestra parts.

He sometimes scored 20 or more films a year, often working only from a script before screening the rushes. Directors marveled at his range — tarantellas, psychedelic screeches, swelling love themes, tense passages of high drama, stately evocations of the 18th century or eerie dissonances of the 20th — and at the ingenuity of his silences: He was wary of too much music, of overloading an audience with emotions.

Mr. Morricone composed for television films and series (some of his music was reused on “The Sopranos” and “The Simpsons”), wrote about 100 concert pieces, and orchestrated music for popular singers, including Joan Baez, Paul Anka and Anna Maria Quaini, the Italian star known as Mina.

Mr. Morricone never learned to speak English, never left Rome to compose, and for years refused to fly anywhere, though he eventually flew all over the world to conduct orchestras, sometimes performing his own compositions. While he wrote extensively for Hollywood, he did not visit the United States until 2007, when, at 78, he made a monthlong tour, punctuated by festivals of his films.

He gave concerts in New York at Radio City Music Hall and the United Nations, and he concluded the tour in Los Angeles, where he received an honorary Academy Award for lifetime achievement. The presenter, Clint Eastwood, roughly translated his acceptance speech from the Italian as the composer expressed “deep gratitude to all the directors who had faith in me.”

Ennio Morricone was born in Rome on Nov. 10, 1928, one of five children of Mario and Libera (Ridolfi) Morricone. His father, a trumpet player, taught him to read music and play various instruments. Ennio wrote his first compositions at 6. In 1940, he entered the National Academy of Santa Cecilia, where he studied trumpet, composition and direction.

His World War II experiences — hunger and the dangers of Rome as an “open city” under German and American armies — were reflected in some of his later work. After the war, he wrote music for radio; for Italy’s broadcasting service, RAI; and for singers under contract to RCA.

Mr. Morricone’s survivors include his wife, Maria Travia, whom he married in 1956 and cited when accepting his 2016 Oscar; four children, Marco, Alessandra, Andrea (a composer and conductor) and Giovanni; and four grandchildren.

Mr. Morricone’s first film credit was for Luciano Salce’s comedy “The Fascist” (1961). He soon began his collaboration with Mr. Leone, a former schoolmate. But he also scored political films: Gillo Pontecorvo’s “The Battle of Algiers” (1966), Mr. Pasolini’s “The Hawks and the Sparrows” (1966), Giuliano Montaldo’s “Sacco and Vanzetti” (1971) and Mr. Bertolucci’s “1900” (1976).

Five Morricone scores nominated for Oscars displayed his virtuosity. In Mr. Malick’s “Days of Heaven” (1978), he captured a love triangle in the Texas Panhandle, circa 1916. For “The Mission” (1986), about an 18th-century Jesuit priest (Jeremy Irons) in the Brazilian rain forest, he wove the panpipe music of Indigenous people with that of a missionary party’s European instruments, playing out the cultural conflicts.

In “The Untouchables,” his music pounded out the struggle between Eliot Ness (Kevin Costner) and Al Capone (Robert De Niro) in Prohibition-era Chicago. In Mr. Levinson’s “Bugsy” (1991), about the mobster Bugsy Siegel (Warren Beatty), it was a medley for a star-struck sociopath in Hollywood. And in Mr. Tornatore’s “Malèna” (2000), he orchestrated the ordeals of a wartime Sicilian town as seen through the eyes of a boy obsessed with a beautiful lady.

Talking to Mr. Pareles, Mr. Morricone, a devout Roman Catholic, placed his acclaimed oeuvre in a modest perspective. “The notion that I am a composer who writes a lot of things is true on one hand and not true on the other hand,” he said. “Maybe my time is better organized than many other people’s. But compared to classical composers like Bach, Frescobaldi, Palestrina or Mozart, I would define myself as unemployed.”

NY Times

DNA Linked to Covid-19 Was Inherited From Neanderthals, Study Finds

The stretch of six genes seems to increase the risk of severe illness from the coronavirus.

“A stretch of DNA linked to Covid-19 was passed down from Neanderthals 60,000 years ago, according to a new study.

Scientists don’t yet know why this particular segment increases the risk of severe illness from the coronavirus. But the new findings, which were posted online on Friday and have not yet been published in a scientific journal, show how some clues to modern health stem from ancient history.

“This interbreeding effect that happened 60,000 years ago is still having an impact today,” said Joshua Akey, a geneticist at Princeton University who was not involved in the new study.

This piece of the genome, which spans six genes on Chromosome 3, has had a puzzling journey through human history, the study found. The variant is now common in Bangladesh, where 63 percent of people carry at least one copy. Across all of South Asia, almost one-third of people have inherited the segment.

Elsewhere, however, the segment is far less common. Only 8 percent of Europeans carry it, and just 4 percent have it in East Asia. It is almost completely absent in Africa.

It’s not clear what evolutionary pattern produced this distribution over the past 60,000 years. “That’s the $10,000 question,” said Hugo Zeberg, a geneticist at the Karolinska Institute in Sweden who was one of the authors of the new study.

One possibility is that the Neanderthal version is harmful and has been getting rarer over all. It’s also possible that the segment improved people’s health in South Asia, perhaps providing a strong immune response to viruses in the region.

“One should stress that at this point this is pure speculation,” said Dr. Zeberg’s co-author, Svante Paabo, the director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Researchers are only beginning to understand why Covid-19 is more dangerous for some people than others. Older people are more likely to become severely ill than younger ones. Men are at more risk than women.

Social inequality matters, too. In the United States, Black people are far more likely than white people to become severely ill from the coronavirus; inequality has left Black people with a high rate of chronic diseases such as diabetes, as well as living conditions and jobs that may increase exposure to the virus.

Genes play a role as well. Last month, researchers compared people in Italy and Spain who became very sick with Covid-19 to those who had only mild infections. They found two places in the genome associated with a greater risk. One is on Chromosome 9 and includes ABO, a gene that determines blood type. The other is the Neanderthal segment on Chromosome 3.

But these genetic findings are being rapidly updated as more people infected with the coronavirus are studied. Just last week, an international group of scientists called the COVID-19 Host Genetics Initiative released a new set of data downplaying the risk of blood type. “The jury is still out on ABO,” said Mark Daly, a geneticist at Harvard Medical School who is a member of the initiative.

The new data showed an even stronger link between the disease and the Chromosome 3 segment. People who carry two copies of the variant are three times more likely to suffer from severe illness than people who do not.
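
As a side note on the frequency figures quoted above: under the textbook Hardy-Weinberg assumption (an idealization added here for illustration, not something the study reports), the fraction of people carrying two copies can be estimated from the fraction carrying at least one:

```python
import math

def allele_freq_from_carriers(carrier_freq):
    """Allele frequency p given the fraction with at least one copy,
    assuming Hardy-Weinberg proportions: carrier_freq = 1 - (1 - p)**2."""
    return 1 - math.sqrt(1 - carrier_freq)

def two_copy_fraction(carrier_freq):
    """Expected fraction of two-copy (homozygous) carriers under the
    same idealized assumption."""
    p = allele_freq_from_carriers(carrier_freq)
    return p ** 2

# Article figure: 63 percent of people in Bangladesh carry at least one copy.
print(f"{two_copy_fraction(0.63):.1%} would carry two copies")
```

Under that idealization, a 63 percent carrier rate would imply roughly 15 percent carrying two copies; real populations need not satisfy the assumption.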

After the new batch of data came out on Monday, Dr. Zeberg decided to find out if the Chromosome 3 segment was passed down from Neanderthals.

About 60,000 years ago, some ancestors of modern humans expanded out of Africa and swept across Europe, Asia and Australia. These people encountered Neanderthals and interbred. Once Neanderthal DNA entered our gene pool, it spread down through the generations, long after Neanderthals became extinct.

Most Neanderthal genes turned out to be harmful to modern humans. They may have been a burden on people’s health or made it harder to have children. As a result, Neanderthal genes became rarer, and many disappeared from our gene pool.

But some genes appear to have provided an evolutionary edge and have become quite common. In May, Dr. Zeberg, Dr. Paabo and Dr. Janet Kelso, also of the Max Planck Institute, discovered that one-third of European women have a Neanderthal hormone receptor. It is associated with increased fertility and fewer miscarriages.

Dr. Zeberg knew that other Neanderthal genes that are common today even help us fight viruses. When modern humans expanded into Asia and Europe, they may have encountered new viruses against which Neanderthals had already evolved defenses. We have held onto those genes ever since.

Dr. Zeberg looked at Chromosome 3 in an online database of Neanderthal genomes. He found that the version that raises people’s risk of severe Covid-19 is the same version found in a Neanderthal who lived in Croatia 50,000 years ago. “I texted Svante immediately,” Dr. Zeberg said in an interview, referring to Dr. Paabo.

Dr. Paabo was on vacation in a cottage in the remote Swedish countryside. Dr. Zeberg showed up the next day, and they worked day and night until they posted the study online on Friday.

“It’s the most crazy vacation I’ve ever had in this cottage,” Dr. Paabo said.

Tony Capra, a geneticist at Vanderbilt University who was not involved in the study, thought it was plausible that the Neanderthal chunk of DNA originally provided a benefit — perhaps even against other viruses. “But that was 40,000 years ago, and here we are now,” he said.

It’s possible that an immune response that worked against ancient viruses has ended up overreacting against the new coronavirus. People who develop severe cases of Covid-19 typically do so because their immune systems launch uncontrolled attacks that end up scarring their lungs and causing inflammation.

Dr. Paabo said the DNA segment may account in part for why people of Bangladeshi descent are dying at a high rate of Covid-19 in the United Kingdom.

It’s an open question whether this Neanderthal segment continues to keep a strong link to Covid-19 as Dr. Zeberg and other researchers study more patients. And it may take discoveries of the segment in ancient fossils of modern humans to understand why it became so common in some places but not others.

But Dr. Zeberg said that the 60,000-year journey of this chunk of DNA in our species might help explain why it’s so dangerous today.

“Its evolutionary history may give us some clues,” Dr. Zeberg said.”

239 scientists in 32 countries have outlined the evidence showing that smaller particles can infect people.

Whether carried aloft by large droplets that zoom through the air after a sneeze, or by much smaller exhaled droplets that may glide the length of a room, these experts said, the coronavirus is borne through air and can infect people when inhaled.

“We’ve known since 1946 that coughing and talking generate aerosols,” said Linsey Marr, an expert in airborne transmission of viruses at Virginia Tech.

In most buildings, she said, “the air-exchange rate is usually much lower, allowing virus to accumulate in the air and pose a greater risk.”
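
Marr’s remark about air-exchange rates can be made concrete with the standard single-zone, well-mixed-room balance (a textbook idealization, not a model from the article; the numbers below are invented for illustration):

```python
def steady_state_concentration(emission_rate, room_volume, ach):
    """Steady-state aerosol concentration in an idealized well-mixed room.

    emission_rate: particles emitted into the room per hour
    room_volume:   room volume in cubic meters
    ach:           air changes per hour (ventilation rate)

    At steady state, emission equals removal by ventilation:
    emission_rate = ach * room_volume * concentration.
    """
    return emission_rate / (ach * room_volume)

# Illustrative numbers: halving the air-exchange rate doubles the
# steady-state concentration in the room.
poorly_ventilated = steady_state_concentration(1000, room_volume=50, ach=1)
well_ventilated = steady_state_concentration(1000, room_volume=50, ach=6)
```

The inverse relationship is the whole point: in a building with a low air-exchange rate, the same emission accumulates to a proportionally higher airborne concentration.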

“We have this notion that airborne transmission means droplets hanging in the air capable of infecting you many hours later, drifting down streets, through letter boxes and finding their way into homes everywhere,” Dr. Hanage said.” NY Times

The human brain tracks speech more closely in time than other sounds

“Humans can effortlessly recognize and react to natural sounds and are especially tuned to speech. Several studies have aimed to localize and understand the speech-specific parts of the brain, but because the same brain areas are mostly active for all sounds, it has remained unclear whether the brain has unique processes for speech, and how it performs them. One of the main challenges has been to describe how the brain matches highly variable acoustic signals to linguistic representations when there is no one-to-one correspondence between the two, e.g., how the brain identifies words spoken by very different speakers, and in different dialects, as the same words.

For this latest study, the researchers, led by Professor Rita Salmelin, decoded and reconstructed spoken words from millisecond-scale brain recordings in 16 healthy Finnish volunteers. They adopted the novel approach of using the natural acoustic variability of a large variety of sounds (words spoken by different speakers, environmental sounds from many categories) and mapping them to magnetoencephalography (MEG) data using physiologically inspired machine-learning models. These types of models, with time-resolved and time-averaged representations of the sounds, have been used in brain research before. The novel, scalable formulation by co-lead author Ali Faisal allowed for applying such models to whole-brain recordings, and this study is the first to compare the same models for speech and other sounds.

Aalto researcher and lead author Anni Nora says, ‘We discovered that time-locking of the cortical activation to the unfolding speech input is crucial for the encoding of speech. When we hear a word, e.g. “cat”, our brain has to follow it very accurately in time to be able to understand its meaning’.

By contrast, time-locking was not highlighted in cortical processing of non-speech environmental sounds that conveyed the same meanings as the spoken words, such as music or laughter. Instead, time-averaged analysis was sufficient to reach their meanings. ‘This means that the same representations (what a cat looks like, what it does, how it feels, etc.) are accessed by the brain also when you hear a cat meowing, but the sound itself is analyzed as a whole, without need for similar time-locking of brain activation’, Nora explains.

Time-locked encoding was also observed for meaningless new words. However, responses even to human-made non-speech sounds such as laughter didn’t show improved decoding with the dynamic time-locked mechanism and were better reconstructed using a time-averaged model, suggesting that time-locked encoding is special for sounds identified as speech.
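
The contrast between the two model families can be sketched as follows (a simplified NumPy illustration, not the authors’ code; the function names, the circular-shift lag embedding and the closed-form ridge solver are assumptions of this sketch):

```python
import numpy as np

def time_resolved_features(spectrogram, n_lags=5):
    """Stack lagged copies of a (time x frequency) spectrogram so a linear
    model can exploit the sound's unfolding in time (time-locked encoding).
    np.roll gives circularly shifted copies: a crude stand-in for a proper
    lag embedding, sufficient for illustration."""
    lagged = [np.roll(spectrogram, lag, axis=0) for lag in range(n_lags)]
    return np.concatenate(lagged, axis=1)             # (T, F * n_lags)

def time_averaged_features(spectrogram):
    """Collapse the same spectrogram over time: the model sees only the
    sound's overall spectral profile, not its temporal structure."""
    T, _ = spectrogram.shape
    return np.tile(spectrogram.mean(axis=0), (T, 1))  # (T, F)

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression weights: (X^T X + alpha*I)^-1 X^T y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ y)
```

Comparing how well each feature set predicts (or reconstructs) the recorded brain signal is what distinguishes a time-locked from a time-averaged account of a given sound category.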

Results indicate that brain responses follow speech with especially high temporal fidelity

The current results suggest that, in humans, a special time-locked encoding mechanism might have evolved for speech. Based on other studies, this processing mechanism seems to be tuned to the native language with extensive exposure to the language environment during early development.

The present finding of time-locked encoding, especially for speech, deepens the understanding of the computations required for mapping between acoustic and linguistic representations (from sounds to words). The current findings raise the question of what specific aspects within sounds are crucial for cueing the brain into using this special mode of encoding. To investigate this further, the researchers aim next to use real-life-like auditory environments such as overlapping environmental sounds and speech. ‘Future studies should also determine whether similar time-locking might be observed with specialization in processing other sounds through experience, e.g. for instrumental sounds in musicians’, Nora says.

Future work could investigate the contribution of different properties within speech acoustics and the possible effect of an experimental task to boost the use of time-locked or time-averaged mode in sound processing. These machine learning models could also be very useful when applied to clinical groups, such as investigating individuals with impaired speech processing.”

Neuroscience Journal

June 2020:

Plans for coronavirus immunity passports should worry us all

“If you’ve had Covid-19, an immunity passport could be your ticket out of lockdown — but even companies designing such systems aren’t sure it’s an idea that will or should ever be widely used.

An immunity passport or health certificate is a way of proving to others — your boss, an airline, or a bouncer at a bar — that you have antibodies against the coronavirus that causes Covid-19. It could be a piece of paper, a QR code or a colour code on an app, but regardless of the format, it aims to show you aren’t at risk of spreading infection.

However, the World Health Organisation (WHO) warned against immunity passports on the grounds that we don’t yet know how immunity works with Covid-19. It seems that having had the disease means it’s likely that you won’t catch it again, but it remains unclear exactly how long that protection lasts or how strong it will be. “The science is still, distinctly, undecided. Plus, we don’t yet have readily available, proven antibody tests. WHO has said very clearly that the science is not there,” says Imogen Parker, head of policy at the Ada Lovelace Institute.

The idea of a digital immunity passport or certificate also raises security and privacy concerns, not dissimilar to those around contact tracing apps. And the system raises the spectre of a two-tier society, with those thought to be immune carrying on with life as normal while the rest remain in various states of lockdown.

Despite such challenges and warnings, variations on an immunity certificate or passport are already being used in Chile, while Italy and Germany are considering the idea. The UK is actively interested, with health secretary Matt Hancock saying the government was working on “systems of certification” – though immunity must be better understood and testing sorted first.

Plenty of tech firms have leapt into the fray to offer digital versions, and Estonia is trialling a system called ImmunityPassport developed by a group called Back to Work, led by Transferwise founder Taavet Hinrikus. As the country loosened its lockdown, it sought tools to aid that process, especially for employers to know their staff had been tested. Estonia already has digital identities for all of its citizens, so the project was linked to that system. “The idea was they could act on data, rather than guessing about symptoms,” says Harsh Sinha, the CTO of Transferwise who also worked on the project.

Sinha says the work started amid wider discussions of antibody tests, with colleagues wondering if there was a way to certify tests and show that evidence to someone else via a smartphone. Certified tests are logged in the system, and users can share their status via a temporary QR code. Once scanned, it shows the necessary health information and a photo of the user for identification. The aim was to give frontline workers more confidence that work was safe, as well as to let people who had recovered from the disease help care for older relatives.
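The “temporary QR code” flow described here can be sketched as a short-lived signed token. This is only an illustration under assumptions: every name and parameter below is hypothetical, and a shared-key HMAC stands in for the public-key signatures and verified identity layer a real deployment would need.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical sketch of a short-lived "share status via QR" token.
# A shared HMAC key stands in for real public-key signatures; all
# names and parameters here are assumptions, not the actual system.
ISSUER_KEY = b"registry-secret"   # held by the central test registry
TOKEN_TTL = 300                   # seconds a scanned code stays valid

def issue_share_token(user_id, status, now=None):
    """Registry signs (user, status, expiry); the result is what a QR code would encode."""
    now = time.time() if now is None else now
    payload = json.dumps(
        {"user": user_id, "status": status, "expires": now + TOKEN_TTL},
        sort_keys=True,
    ).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest().encode()
    return base64.urlsafe_b64encode(payload + b"." + tag).decode()

def verify_share_token(token, now=None):
    """Scanner checks the signature and the expiry before trusting the status."""
    now = time.time() if now is None else now
    payload, _, tag = base64.urlsafe_b64decode(token).rpartition(b".")
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        return None               # forged or corrupted token
    data = json.loads(payload)
    return data if data["expires"] > now else None  # expired codes rejected
```

After verification, the app in the article would additionally display the stored photo for an identity check; nothing in this sketch touches the harder problems the piece raises, namely test reliability and immunity duration.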

A team of half a dozen colleagues worked evenings and weekends to cobble together a system, knowing that governments would struggle to move fast. “We wanted to build a quick, iterative prototype,” says Sinha. It’s open source and generic enough to be adapted to scientific evidence as it emerges. While working on the project, the team realized that antibody testing remained questionable, suggesting immunity passports might never prove useful.

“We were clear that the probability of this seeing the light of day was less than one per cent,” Sinha says. That’s because without readily available, reliable, antibody testing, such an app has little use. “The data capture has to be there from the test,” Sinha says. “Without that the tech is as good as nothing.”

Even with testing, as we still don’t know how immunity works, there’s little point in a wider rollout. “Until that becomes clear, there is no point in expanding the pilot,” says Hinrikus. “We should be clear about that, we are building this to save time in the future… Once we have agreement around immunity and availability of widespread and cheap testing, then we could roll it out more widely. But we’re not suggesting that based on the current low quality tests that we should actually be using this.”

Beyond the science, there’s another hurdle to immunity passports: privacy. These tools will hold health data that’s key to work and social life, so keeping that secure and private is necessary. The ImmunityPassport system created by Transferwise staff is held centrally and – by necessity – not anonymised, but it holds a limited amount of personal data: your name, photo and status.

There is a decentralised version in the works, created by researchers at the Turing Institute. Their Secure AntiBody Certificate (SecureABC) system lets a healthcare or test provider issue a digital or paper certificate confirming a positive antibody test, says Chris Hicks, one of the lead researchers on the project.”

Wired

6,000 Strains of Bacteria Under 1 Roof

“In the winter of 1915, Pvt. Ernest Cable arrived at the Number 14 Stationary Hospital in Wimereux, France, in bad shape. The British army’s soldiers stationed on the Western Front of World War I were being ravaged by a variety of microscopic enemies. For Private Cable, it was Shigella flexneri, the bacterium that causes dysentery.

A military bacteriologist named Lt. William Broughton-Alcock took a sample of S. flexneri from Private Cable’s body after he died on March 13, 1915. It was likely kept alive in agar, sealed under paraffin wax, and was eventually renamed NCTC 1 when it became the very first specimen added to Britain’s National Collection of Type Cultures, the oldest library of human bacterial pathogens in the world devoted to sharing strains with other scientists. The collection turned 100 this year.

Managed by Public Health England, the N.C.T.C. holds about 6,000 bacterial strains representing more than 900 species that can infect, sicken, maim and kill us. (Strains are genetic variants of a species.) Of the nearly 800 registered culture collections in 78 countries, it’s one of only a few dedicated to clinically relevant bacteria — that is, to species that make us sick.

About half of the microorganisms in the world’s culture collections are bacteria, dwarfing the number of viruses and fungi. While many scientists today are focused on fighting the spread of the novel coronavirus, bacteria continue to outmaneuver our immune systems and antibiotics. We think of them as invaders in our world, but really, we live in theirs.

“On any possible, reasonable or fair criterion,” wrote the evolutionary biologist Stephen Jay Gould, “bacteria are — and always have been — the dominant forms of life on Earth.”

The collection supplies many of the world’s clinical microbiologists with authenticated microbial strains of known origin. These scientists study how bacteria evolve, test safety protocols for infectious pathogens, develop vaccines, anticancer drugs and treatments for metabolic diseases, and study the ever increasing problem of antimicrobial resistance. Private Cable’s killer, for instance, was brought back to life from its freeze-dried form by Kate Baker, a microbiologist at the University of Liverpool, and her colleagues, part of an effort to understand how S. flexneri, which still kills 164,000 people every year, most of them children, has evolved over the past century.

The team sequenced NCTC 1’s genome and then compared it with other strains isolated in 1954, 1984 and 2002. Only 2 percent of the bacterium’s genome had changed over the century, but those changes were associated with higher virulence, immune evasion and greater antimicrobial resistance.
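The headline figure here, 2 percent of the genome changed over a century, is at bottom a per-position mismatch count over aligned sequences. A toy sketch with made-up fragments (not real NCTC data, and ignoring the alignment and indel handling a real genome comparison requires):

```python
def percent_divergence(a, b):
    """Percentage of aligned positions at which two sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    mismatches = sum(x != y for x, y in zip(a, b))
    return 100.0 * mismatches / len(a)

# Hypothetical aligned fragments, one substitution apart.
reference = "ATGCCGTTAGCAATGCCGTTAGCA"
modern    = "ATGCCGTTAGCAATGACGTTAGCA"
```

A real pipeline would align whole genomes first and then annotate which differing positions fall in genes tied to virulence or resistance, but the divergence number itself reduces to this kind of count.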

When researchers like Dr. Baker discover a new species or strain, they can deposit it in the N.C.T.C.

“Their science can then be reproducible, because other people can study it,” said Sarah Alexander, the collection’s lead scientist and curator. “There may be new applications for those strains.”

“From my perspective, it is one of the most important collections worldwide,” said Jörg Overmann, the director of the German Collection of Microorganisms and Cell Cultures, one of the world’s largest and most diverse.

The collection first opened in London in 1920 at the Lister Institute of Preventive Medicine. Its first 200 cultures — including Private Cable’s — were deposited by Sir Frederick William Andrewes, a pathologist who studied dysentery throughout World War I.

The organization sent 2,000 strains to various institutions for free over the next year. The bacteria were delivered alive, teeming on a medium of agar made from Dorset egg yolks and sealed with paraffin wax.

Safety protocols weren’t in place yet: In 1922, three N.C.T.C. researchers caught tularemia, or rabbit fever, during an experiment in which they’d rubbed the spleen of a guinea pig infected with Francisella tularensis on the scarified skin of a healthy guinea pig.

The collection was transferred to a farmhouse north of London in 1939, a lucky move as the institute was bombed during World War II. In 1947, the curator honed the collection to medical and veterinary strains. The collection began charging scientists two shillings and sixpence per strain — about $5 today.

In the following decades, the growing collection moved back to London and raised its prices. Today it is a nonprofit that’s self-supporting through the sale of strains, which usually cost between $85 and $375.

“I need to make sure the collections are scientifically relevant and financially robust,” said Julie Russell, the head of culture collections at Public Health England, which also has collections of pathogenic viruses and fungi.

The N.C.T.C. holds many bacteria relevant to medical breakthroughs. Alexander Fleming, who discovered penicillin, deposited 16 strains into the collection between 1928 and 1948. Fleming sourced NCTC 4842, the bacterium Haemophilus influenzae, from his own nose. Betty Hobbs, a noted expert on food poisoning who identified Clostridium perfringens as the culprit behind many food-borne illnesses, deposited more than 20 related strains.

A subset of its holdings is the Murray Collection, assembled by Everitt George Dunne Murray in the first half of the 20th century from the stool, urine, blood, cerebrospinal fluid and other bodily products of sick people across the world. The sub-collection’s 683 strains span the period when antibiotics entered into general use.

“It gives us this snapshot of an era for which we don’t have much information but that is critical for understanding how we’ve got to the antimicrobial crisis that we’re in today,” Dr. Baker said.

The collection has also sequenced the genomes of about half the strains, making that data available publicly for genetic research.

In 2019, the collection sent 3,803 ampuls of bacteria to 63 countries. Among the most requested genera were Clostridium (a leading cause of infectious diarrhea), E. coli (360 strains, some dangerous, others harmless), Staphylococcus (causing infections ranging from mild to fatal), Mycobacteriaceae (responsible for tuberculosis and leprosy, among others) and Salmonella (from contaminated food).

They’re shipped from a distribution center outside London under strict protocols, with safe handling instructions. Most bacteria are biosafety level two or three, which means they can cause serious or lethal diseases but have a cure. Level 4, the deadliest, includes only viruses.

The collection is also growing at a good clip.

“We receive anywhere between 50 and 200 strains a year from all sorts of sources,” said Jake Turnbull, a microbiologist at the collection.

Some are newly discovered, called type strains. Others are deposited from historical collections, or by scientists who retire or shift their research focus and want their strains to have a future.

New specimens are cultured on agar to make sure they’re alive and uncontaminated, suspended in a sugar-rich cryoprotectant broth, freeze-dried at about minus 28 degrees Fahrenheit for 3 to 4 hours, plugged with sterilized cotton, flame sealed in an evacuated glass ampul and stored at 39 degrees. Not all specimens survive long-term storage.

“The process we use is very similar to the one developed in the 1930s,” Ms. Russell said.

Each sample must come with a description of its origin, identification and special characteristics that are added to a searchable database.

“Fifty years ago, you may just get a handwritten letter with a strain,” Dr. Alexander said. “Now we may get a strain that’s had its whole genome sequenced.”

They accept 90 percent of the samples they receive. Most are strains that are currently circulating, are responsible for outbreaks or have novel antimicrobial resistance profiles.

One 2018 acquisition was NCTC 14208, a strain of Neisseria gonorrhoeae swabbed from the throat of a British man who had recently contracted gonorrhea. The sexually transmitted disease, which infects nearly 80 million people every year, is becoming nearly untreatable.

“Evolutionarily, it’s quite an amazing bacterium,” Dr. Alexander said. “It has a lot of horizontal gene transfer. They swap antimicrobial resistance genes between the strains.”

The man had been given ceftriaxone and azithromycin — the only remaining treatment for the infection — but the bacteria beat them both. Another treatment eventually cured him.

In 2019, the collection sent 28 strains of N. gonorrhoeae to dozens of researchers. The strain from the British man, referred to as “super gonorrhea,” went to six.

In late February, the collection got a visit from Anna Dumitriu, who became its first artist-in-residence in 2018. Her work, often done in collaboration with scientists, frequently incorporates microorganisms.

She and a reporter observed the extraction of the super gonorrhea’s DNA in an N.C.T.C. molecular biology lab ahead of getting a lesson from Dr. Alexander on proteomics, a tool for studying antibiotic resistance that involves analyzing bacterial proteins.

“The loves of my life are chlamydia and gonorrhea,” Dr. Alexander, who has researched STDs for years, said during the session.

In anticipation of the collection’s 100th anniversary, Ms. Dumitriu created a raw-silk plague dress impregnated with killed Yersinia pestis bacteria, which she extracted from the collection’s samples with Dr. Alexander’s help.

She also observed the process of how a strain becomes part of the collection, using a penicillin-resistant strain of Staphylococcus aureus swabbed from her own nose, à la Alexander Fleming, and added to the collection as NCTC 14139.

“I’ve used it in quite a lot of artworks and installations,” Ms. Dumitriu said.

She isn’t sure yet what creative expression the super gonorrhea strain will yield.

Antimicrobial resistance is likely to be one of the most pressing public health concerns for years to come. Of the 49 antibiotics currently in development, only four have been approved, and less than a quarter come from novel drug classes.

The samples being studied, donated and preserved in the N.C.T.C. and other culture collections will almost certainly play a role in medical breakthroughs decades in the future, just as Private Cable’s has a century after he died.

Dr. Alexander is keenly aware of this long-term view. Scientists who place their microorganisms in the collection “leave their legacy,” she said.

“You immortalize your science. We’re very much hopeful that in a hundred years’ time, people may be able to access strains that scientists deposited a hundred years ago.”

NY Times

Brain structure that controls our behavior discovered

“For our social life and our profession we must be able to deal with our environment and other people. Executive functions, meaning the basic intellectual abilities that control human thought and action, help us to do this. These include selective attention, otherwise known as the ability to concentrate on one stimulus and suppress others, or working memory, with which we can retain and manipulate information. These functions also enable us to plan actions and to divide them into individual steps.

However, some people struggle with this: they find it difficult to focus and to plan their actions in a goal-oriented manner, and they have poor control over their impulses and emotions. They suffer from a condition called dysexecutive syndrome, which is often caused by craniocerebral trauma or a stroke.

One of those affected is a 56-year-old patient from Leipzig. She had suffered several strokes that hit a strategically very important region of the brain: the so-called inferior frontal junction area (IFJ) in the frontal lobe of the cerebral cortex in both hemispheres. The injury meant she was no longer able to pass basic psychological tests. These include, for example, the planning task “zoo visit,” in which a person is required to plan a tour of a zoo in accordance with various guidelines, or the Stroop test, which measures how well someone can suppress disturbing, unimportant stimuli in order to concentrate on an actual task.

The special feature of the examined patient: the lesion was limited to the IFJ alone, in both hemispheres of the brain equally (see figure). Normally, a stroke injures larger areas of the brain or is not restricted to such a defined area. In addition, it rarely affects the homologous areas in both hemispheres of the brain at the same time. As difficult as the situation is for the patient, it offers a unique opportunity for science to investigate the role of this region in executive functions.

“From functional MRI examinations on healthy persons, it was already known that the IFJ is increasingly activated when selective attention, working memory and the other executive functions are required. However, the final proof that these executive abilities are located there has not yet been provided,” explains Matthias Schroeter, first author of the underlying study and head of the research group “Cognitive Neuropsychiatry” at MPI CBS. Causal evidence of such functional-anatomical relationships can only be obtained when the areas are actually switched off – and thus the abilities located there actually fail. “We were able to provide this proof with the help of this patient.”

And not only that; in addition to the classic approach — assigning individual functions to a specific brain region on the basis of brain damage and the corresponding impairments — the researchers also took the opposite approach: the “big data” approach via databases. These portals contain information from tens of thousands of participants from many psychological tests and the brain areas activated in the process. With their help, the researchers were able to predict the patient’s impairments solely on the basis of the brain damage determined by brain scans. Experts refer to this as symptom reading, a method which could be used in the future to adapt a therapy to individual patients and their brain damage without having to test it in detail.

“If patients suffer from a loss of executive functions after an accident or stroke, for example, they are usually less able to regenerate the other affected abilities because they find it difficult to plan for them,” said Schroeter. “In future, when the lesion images and databases provide us with more detailed information on which regions, and hence abilities, have failed, we will be able to adapt the therapy even more specifically.”

Neuroscience Journal

May 2020:

Body language can also be heard

“When people move their hands and arms while using their voices, listeners are able to hear it. Even without seeing the messenger, we can pick up each other’s body language. Wim Pouw of the Donders Institute has published this finding in PNAS this week.

Most people gesture while talking. From a small wrist movement to complete sign language involving arms, hands and fingers. This form of non-verbal communication supports what someone says, and, in some cases, it is even indispensable in order to explain something properly.

This is difficult if you cannot see the other person, you’d think. But even during phone calls, or when shouting from another room, we often talk with our hands. And that is not pointless at all, according to the study conducted by Wim Pouw and his colleagues from the University of Connecticut: your gestures resonate in your voice.

Louder and higher

It’s all about acoustics: the pitch and volume of a voice change together with the movement of arms and hands. “That change is very subtle with a wrist movement,” says Pouw. “It is less subtle with an arm movement. The pitch jumps up slightly whenever a movement slows down.”

According to Pouw, there are two different causes for these acoustic differences. One of the causes is the creation of vibrations. The forces involved in a movement cause vibrations in your body. Through the connective tissue that holds your body together, vibrations end up in your lungs, affecting the pressure in your lungs.

The second cause is muscle tension around your lungs that is needed to maintain balance. We do not merely use our arm muscles when we move our arms. “When starting the process of stopping your arm from moving, for example, other muscles are suddenly addressed to prevent your body from falling over. These muscles that maintain balance include muscles around your lungs.”

Unconsciously hearing an arm movement

As part of this study, Pouw instructed three men and three women to make a monotone sound, such as ‘aaaa’, while using all kinds of different hand and arm movements. After that, 30 subjects were asked to listen to the recordings. Not only did they guess which movements were made, but, in many cases, they were also able to mimic these movements simultaneously.

It is easy to measure differences in pitch and volume. It is remarkable that listeners unconsciously identify at what point movement causes these differences. “The subjects not only picked up the speed of the movement, but they also heard the location of the movement.”
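The two acoustic quantities the study tracked, volume and pitch, can indeed be measured with a few lines of signal processing. A minimal sketch (not the study’s actual analysis pipeline), measuring volume as RMS amplitude and estimating pitch from the strongest autocorrelation peak of a synthetic, steady ‘aaaa’-like tone:

```python
import numpy as np

def rms_volume(signal):
    """Volume as root-mean-square amplitude."""
    return float(np.sqrt(np.mean(signal ** 2)))

def estimate_pitch(signal, sample_rate, fmin=75.0, fmax=400.0):
    """Fundamental frequency from the strongest autocorrelation peak
    within the plausible range of human voice pitch."""
    signal = signal - signal.mean()
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sample_rate / lag

# A synthetic, perfectly steady 220 Hz tone standing in for an "aaaa".
sample_rate = 16000
t = np.arange(sample_rate // 4) / sample_rate   # 0.25 s of audio
tone = np.sin(2 * np.pi * 220.0 * t)
```

Tracking these two numbers frame by frame over a recording is what lets one line up subtle pitch jumps and loudness changes with the timing of arm movements.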

Emphasize words by hand

Hence, a voice is more than just an abstract collection of sounds, according to the researcher: “When you hear a voice, you literally hear aspects of a person’s entire body.”

These findings go against the assumption that gestures basically only serve to depict or point out something. “It contributes to the understanding that there is a closer relationship between spoken language and gestures. Hand gestures may have been created to support the voice, to emphasize words, for example.”

The insights contribute to knowledge about ourselves and to knowledge about speech recognition. Think of systems such as Google Home and Siri. “When developing speech recognition, we have to take movement into account. Think of gestures, but also of someone running while talking, for example.”

Information about movement-related changes in the voice could improve the way systems interpret the meanings and weights of words. Or the other way around: noise in the voice produced by a speaker while running does not affect the meaning of a word. “We can teach these kinds of systems what they should or should not filter out.”

Neuroscience Journal

Bilingualism delays the brain’s aging process

“The next time you get the urge to lapse into English while conversing in your other language, try not to – your brain will thank you for it.

Researchers from the Singapore University of Technology and Design (SUTD) found that active bilingualism – the regular, balanced use of two languages and language switching – offers protection against the brain’s aging process. The paper was published in the Journals of Gerontology: Psychological Sciences.

Current literature on the effects of bilingualism on the adult brain was inconsistent and lacked clear trends – some studies reported that second language proficiency meant greater neural efficiency, whereas others concluded that it made no difference whatsoever. So the researchers from SUTD set out to explore the executive control mechanisms and the context under which bilingualism can protect against cognitive decline in the normal aging process.

Executive functions are complex, higher-order processes that the brain performs. They mainly allow people to maintain attention by focusing on relevant information and ignoring distractors, to retain information until it is acted upon, and to plan motor actions.

In the study, which was conducted in Singapore, cognitively healthy bilingual seniors aged between 60 and 84 were tasked to complete an array of computerized executive control tasks.

Tasks selected were commonly used in previous studies and identified with reference to well-established theories involving older adults that showed decreased performance with aging. For a more holistic examination, the researchers measured six different domains of executive control using four different tasks, all of which had been previously associated with bilingualism, while controlling for individual variables such as age, processing speed and fluid intelligence.

It was found that active usage of two languages with less frequent language switching predicted better performance in the goal maintenance and conflict monitoring aspects of executive control. This suggests that bilingualism can be a protective source against cognitive decline in the normal aging process. Importantly, active bilingualism can be seen as a lifestyle factor that could buffer against cognitive declines that are associated with normal aging.

“The effort involved in not switching between languages and ‘staying’ in the target language is more cognitively demanding than switching between languages while actively using both. Our study shows that the seniors developed more efficient neural organization in brain regions related to language control, which also overlap with areas involved in executive control,” explained the lead principal investigator and corresponding author, an associate professor at SUTD.”

Neuroscience Journal

Privacy and psychological issues of Video Conferencing

Zoom, Meet, Room and … all have privacy issues.

There’s a reason video apps make you feel awkward and unfulfilled.

“Last month, global downloads of the apps Zoom, Houseparty and Skype increased more than 100 percent as video conferencing and chats replaced the face-to-face encounters we are all so sorely missing. Their faces arranged in a grid reminiscent of the game show Hollywood Squares, people are attending virtual happy hours and birthday parties, holding virtual business meetings, learning in virtual classrooms and having virtual psychotherapy.

But there are reasons to be wary of the technology, beyond the widely reported security and privacy concerns. Psychologists, computer scientists and neuroscientists say the distortions and delays inherent in video communication can end up making you feel isolated, anxious and disconnected (or more than you were already). You might be better off just talking on the phone.

The problem is that the way the video images are digitally encoded and decoded, altered and adjusted, patched and synthesized introduces all kinds of artifacts: blocking, freezing, blurring, jerkiness and out-of-sync audio. These disruptions, some below our conscious awareness, confound perception and scramble subtle social cues. Our brains strain to fill in the gaps and make sense of the disorder, which makes us feel vaguely disturbed, uneasy and tired without quite knowing why.

Jeffrey Golde, an adjunct professor at Columbia Business School, has been teaching his previously in-person leadership class via Zoom for about a month now and he said it’s been strangely wearing. “I’ve noticed, not only in my students, but also in myself, a tendency to flag,” he said. “It gets hard to concentrate on the grid and it’s hard to think in a robust way.”

This is consistent with research on interpreters at the United Nations and at European Union institutions, who reported similar feelings of burnout, fogginess and alienation when translating proceedings via video feed. Studies on video psychotherapy indicate that both therapists and their patients also often feel fatigued, disaffected and uncomfortable.

Sheryl Brahnam, a professor in the department of information technology and cybersecurity at Missouri State University in Springfield, explains the phenomenon by comparing video conferencing to highly processed foods. “In-person communication resembles video conferencing about as much as a real blueberry muffin resembles a packaged blueberry muffin that contains not a single blueberry but artificial flavors, textures and preservatives,” she said. “You eat too many and you’re not going to feel very good.”

To be sure, video calls are great for letting toddlers blow kisses to their grandparents, showing people what you’re cooking for dinner or maybe demonstrating how to make a face mask out of boxer briefs. But if you want to really communicate with someone in a meaningful way, video can be vexing.

This is foremost because human beings are exquisitely sensitive to one another’s facial expressions. Authentic expressions of emotion are an intricate array of minute muscle contractions, particularly around the eyes and mouth, often subconsciously perceived, and essential to our understanding of one another. But those telling twitches all but disappear on pixelated video or, worse, are frozen, smoothed over or delayed to preserve bandwidth.

Not only does this mess with our perception, but it also plays havoc with our ability to mirror. Without realizing it, all of us engage in facial mimicry whenever we encounter another person. It’s a constant, almost synchronous, interplay. To recognize emotion, we have to actually embody it, which makes mirroring essential to empathy and connection. When we can’t do it seamlessly, as happens during a video chat, we feel unsettled because it’s hard to read people’s reactions and, thus, predict what they will do.

“Our brains are prediction generators, and when there are delays or the facial expressions are frozen or out of sync, as happens on Zoom and Skype, we perceive it as a prediction error that needs to be fixed,” said Paula Niedenthal, a professor of psychology at the University of Wisconsin at Madison who specializes in affective response. “Whether subconscious or conscious, we’re having to do more work because aspects of our predictions are not being confirmed and that can get exhausting.”

Video chats have also been shown to inhibit trust because we can’t look one another in the eye. Depending on the camera angle, people may appear to be looking up or down or to the side. Viewers may then perceive them as uninterested, shifty, haughty, servile or guilty. For this reason, law scholars and criminal justice activists have questioned the fairness of remote depositions, hearings and trials. But as anyone who has been on a video call knows, people tend to look more at themselves than at the camera or even at others on the call. “I would be lying if I said I wasn’t super aware of my appearance on video chats,” said Dave Nitkiewicz, a recently furloughed employee of Experience Grand Rapids, the convention and visitors’ bureau in Grand Rapids, Mich. “I have the skin of Casper the Ghost right now — it’s, like, fluorescent — so I’m always concerned with framing and lighting.”

Craving company while confined at home, Mr. Nitkiewicz frequently arranges Zoom meet-ups with family and friends and he even went on a Zoom date. And yet he doesn’t find these interactions terribly satisfying.

“On video chat there’s literally a glowing box around your face when you’re talking, so you feel like every eyeball is on you, like a very intimidating job interview,” Mr. Nitkiewicz said. “The conversation kind of defaults to trivial drivel because people don’t want to take a risk.” And the delay in people’s feedback makes him feel that it wouldn’t be rewarding to share a good story anyway.

He doesn’t feel the same reserve when he talks on the phone, which he does for two or three hours every other Sunday with his cousin in Los Angeles. “We have for years and it’s never occurred to us to video chat,” said Mr. Nitkiewicz. “Our comfort place is still on the phone.”

This makes sense given that experts say no facial cues are better than faulty ones. The absence of visual input might even heighten people’s sensitivity to what’s being said. It could be why Verizon and AT&T have reported average daily increases of as much as 78 percent in voice-only calls since the start of the pandemic, as well as an increase in the length of these calls.

“You can have a sense of hyper-presence on the telephone because of that coiled relationship where it feels like my mouth is right next to your ear, and vice versa,” said Dr. Brahnam during a telephone interview. Provided you have a good connection, she said, you end up hearing more: slight tonal shifts, brief hesitations and the rhythm of someone’s breathing. When it comes to developing intimacy remotely, sometimes it’s better to be heard and not seen.”

NY Times

April 2020:

Origins of human language pathway in the brain: 26 million years old

“Scientists have discovered an earlier origin to the human language pathway in the brain, pushing back its evolutionary origin by at least 20 million years.

Previously, a precursor of the language pathway was thought by many scientists to have emerged more recently, about 5 million years ago, with a common ancestor of both apes and humans.

For neuroscientists, this is comparable to finding a fossil that illuminates evolutionary history. However, unlike bones, brains did not fossilize. Instead neuroscientists need to infer what the brains of common ancestors may have been like by studying brain scans of living primates and comparing them to humans.

Professor Chris Petkov from the Faculty of Medical Sciences, Newcastle University, the study lead said: “It is like finding a new fossil of a long lost ancestor. It is also exciting that there may be an older origin yet to be discovered still.”

An international team of European and US scientists carried out the brain imaging study and analysis of auditory regions and brain pathways in humans, apes and monkeys, which is published in Nature Neuroscience.

They discovered a segment of this language pathway in the human brain that interconnects the auditory cortex with frontal lobe regions, important for processing speech and language. Although speech and language are unique to humans, the link via the auditory pathway in other primates suggests an evolutionary basis in auditory cognition and vocal communication.

Professor Petkov added: “We predicted but could not know for sure whether the human language pathway may have had an evolutionary basis in the auditory system of nonhuman primates. I admit we were astounded to see a similar pathway hiding in plain sight within the auditory system of nonhuman primates.”

Remarkable transformation

The study also illuminates the remarkable transformation of the human language pathway. A key human-unique difference was found: the left side of this brain pathway was stronger in humans, and the right side appears to have diverged from the auditory evolutionary prototype to involve non-auditory parts of the brain.

The study relied on brain scans openly shared by the global scientific community. It also generated new brain scans that are globally shared to inspire further discovery. And since the authors predict that the auditory precursor to the human language pathway may be even older, the work inspires the neurobiological search for its earliest evolutionary origin – the next brain ‘fossil’ – in animals more distantly related to humans.

Professor Timothy Griffiths, consultant neurologist at Newcastle University, and joint senior author on the study notes: “This discovery has tremendous potential for understanding which aspects of human auditory cognition and language can be studied with animal models in ways not possible with humans and apes. The study has already inspired new research underway including with neurology patients.”

The study involved Newcastle University, Faculty of Medical Sciences; Max Planck Institute for Cognitive and Brain Sciences; Birkbeck UCL Centre for NeuroImaging; University of Texas MD Anderson Cancer Center, USA; University of Iowa, USA.

Neuroscience Journal

Aerosolized particles carrying virus stay in the air longer than previously thought

The model revealed that the aerosol cloud carrying COVID-19 spreads beyond the immediate vicinity of the coughing person and dilutes in the process, but the dilution takes several minutes.

“A joint project carried out by four Finnish research organizations has studied the transport and spread of coronavirus through the air. Preliminary results indicate that aerosol particles carrying the virus can remain in the air longer than was originally thought, so it is important to avoid busy public indoor spaces. This also reduces the risk of droplet infection, which remains the main path of transmission for coronavirus.

Aalto University, Finnish Meteorological Institute, VTT Technical Research Centre of Finland and University of Helsinki have studied how extremely small airborne aerosol particles emitted from the respiratory tract when coughing, sneezing or even talking are transported in the air. Such particles can carry pathogens such as coronaviruses.

The researchers modeled a scenario where a person coughs in an aisle between shelves, like those found in grocery stores, taking ventilation into consideration. Aalto University, VTT Technical Research Centre of Finland and Finnish Meteorological Institute each carried out the modeling independently, using the same starting conditions.

The researchers obtained the same preliminary result: in the situation under investigation, the aerosol cloud spreads outside the immediate vicinity of the coughing person and dilutes in the process. However, this can take up to several minutes. ‘Someone infected by the coronavirus, can cough and walk away, but then leave behind extremely small aerosol particles carrying the coronavirus. These particles could then end up in the respiratory tract of others in the vicinity’, explains Aalto University Assistant Professor Ville Vuorinen.

‘The preliminary results obtained by the consortium highlight the importance of our recommendations. The Finnish Institute for Health and Welfare recommends that you stay at home if you are unwell and that you maintain physical distance with everyone. The instructions also include coughing into your sleeve or a tissue and taking care of good hand hygiene’, says Jussi Sane, Chief Specialist at the Finnish Institute for Health and Welfare.

‘Based on the modeling of the consortium, it is not yet possible to directly issue new recommendations. However, these results are an important part of the whole, and they should be compared with the data from real-life epidemic studies,’ Sane adds.

The spread of diseases through social networks has been studied extensively. From these infection models, it is known that the spread of a virus may slow down or even be suppressed altogether as mobility decreases at ‘nodal points’ – places where lots of people gather, such as shops, restaurants and public transport. Avoiding busy indoor areas reduces the risk of droplet infection while in close proximity to others, which, according to current information, is the main cause of coronavirus infection.

The researchers of the consortium modeled the airborne movement of aerosol particles smaller than 20 micrometers. For a dry cough, which is a typical symptom of the current coronavirus, the particle size is typically less than 15 micrometers. Extremely small particles of this size do not sink on the floor, but instead, move along in the air currents or remain floating in the same place. Studies of influenza A have confirmed that the influenza A virus can be found in the smallest particles, which measure less than 5 micrometers.

Supercomputer used for modeling

The project involves around 30 researchers, whose specializations include fluid dynamics, aerosol physics, social networks, ventilation, virology, and biomedical engineering. The research is being carried out in conjunction with Essote (the joint municipal authority for social and health services in South Savo), which proposed the research project, as well as infectious diseases specialists from the Finnish Institute for Health and Welfare. The airborne transport and preservation of droplets leaving the respiratory tract were simulated using a supercomputer, and 3D visualization of the results was then carried out. CSC – Finnish IT Center for Science Ltd. made its supercomputer available to researchers at very short notice. Thanks to the high computing capacity and close, multidisciplinary cooperation, the first results were produced in around a week.

The physics of the phenomena now being modeled are very familiar from previous research. The consortium aims to use visualization to create a better understanding of the behavior of aerosol particles. Researchers will continue to work on the modeling and further refine it. Experts in infectious diseases and virology will examine the results and their importance in relation to the information being gathered on coronavirus and coronavirus infections. The involvement of two Swedish universities has further strengthened the consortium.” Neuroscience Journal

Researchers Identify Virus and Two Types of Bacteria as Major Causes of Alzheimer’s

“A worldwide team of senior scientists and clinicians have come together to produce an editorial which indicates that certain microbes – a specific virus and two specific types of bacteria – are major causes of Alzheimer’s Disease. Their paper, which has been published online in the highly regarded peer-reviewed journal, Journal of Alzheimer’s Disease, stresses the urgent need for further research – and more importantly, for clinical trials of anti-microbial and related agents to treat the disease.

This major call for action is based on substantial published evidence into Alzheimer’s. The team’s landmark editorial summarises the abundant data implicating these microbes, but until now this work has been largely ignored or dismissed as controversial – despite the absence of evidence to the contrary. Therefore, proposals for the funding of clinical trials have been refused, despite the fact that over 400 unsuccessful clinical trials for Alzheimer’s based on other concepts were carried out over a recent 10-year period.

Opposition to the microbial concepts resembles the fierce resistance to studies some years ago which showed that viruses cause certain types of cancer, and that a bacterium causes stomach ulcers. Those concepts were ultimately proved valid, leading to successful clinical trials and the subsequent development of appropriate treatments.

Professor Douglas Kell of The University of Manchester’s School of Chemistry and Manchester Institute of Biotechnology is one of the editorial’s authors. He says that supposedly sterile red blood cells were seen to contain dormant microbes, which also has implications for blood transfusions.

“We are saying there is incontrovertible evidence that Alzheimer’s Disease has a dormant microbial component, and that this can be woken up by iron dysregulation. Removing this iron will slow down or prevent cognitive degeneration – we can’t keep ignoring all of the evidence,” Professor Douglas Kell said.

Professor Resia Pretorius of the University of Pretoria, who worked with Douglas Kell on the editorial, said, “The microbial presence in blood may also play a fundamental role as causative agent of systemic inflammation, which is a characteristic of Alzheimer’s disease – particularly, the bacterial cell wall component and endotoxin, lipopolysaccharide. Furthermore, there is ample evidence that this can cause neuroinflammation and amyloid-β plaque formation.”

The findings of this editorial could also have implications for the future treatment of Parkinson’s Disease, and other progressive neurological conditions.”

Neuroscience Journal

Turning Back the Clock on Aging Cells

Researchers report that they can rejuvenate human cells by reprogramming them to a youthful state.

“Researchers at Stanford University report that they can rejuvenate human cells by reprogramming them back to a youthful state. They hope that the technique will help in the treatment of diseases, such as osteoarthritis and muscle wasting, that are caused by the aging of tissue cells.

A major cause of aging is thought to be the errors that accumulate in the epigenome, the system of proteins that packages the DNA and controls access to its genes. The Stanford team, led by Dr. Thomas A. Rando and Vittorio Sebastiano, say their method, designed to reverse these errors and walk back the cells to their youthful state, does indeed restore the cells’ vigor and eliminate signs of aging.

In their report, they described their technique as “a significant step toward the goal of reversing cellular aging,” one that could produce therapies “for aging and aging-related diseases.”

Leonard P. Guarente, an expert on aging at M.I.T., said the method was “one of the most promising areas of aging research” but that it would take a long time to develop drugs based on RNA, the required chemical.

The Stanford approach utilizes powerful agents known as Yamanaka factors, which reprogram a cell’s epigenome to its time zero, or embryonic state.

Embryonic cells, derived from the fertilized egg, can develop into any of the specialized cell types of the body. Their fate, whether to become a skin or eye or liver cell, is determined by chemical groups, or marks, that are tagged on to their epigenome.

In each type of cell, these marks make accessible only the genes that the cell type needs, while locking down all other genes in the DNA. The pattern of marks thus establishes each cell’s identity.

As the cell ages, it accumulates errors in the marking system, which degrade the cell’s efficiency at switching on and off the genes needed for its operations.

But the Yamanaka factors are no simple panacea. Applied to whole mice, the factors made cells lose their functions and primed them for rapid growth, usually cancerous; the mice in the trial all died.

In 2016, Juan Carlos Izpisua Belmonte, of the Salk Institute for Biological Studies in San Diego, found that the two effects of the Yamanaka factors — erasing cell identity and reversing aging — could be separated, with a lower dose securing just age reversal. But he achieved this by genetically engineering mice, a technique not usable in people.

In their paper on Tuesday, the Stanford team described a feasible way to deliver Yamanaka factors to cells taken from patients, by dosing cells kept in cultures with small amounts of the factors.

If dosed for a short enough time, the team reported, the cells retained their identity but returned to a youthful state, as judged by several measures of cell vigor.

Dr. Sebastiano said the Yamanaka factors appeared to operate in two stages, as if they were raising the epigenome’s energy to one level, at which the marks of aging were lost, and then to a higher level at which cell identity was erased.

The Stanford team extracted aged cartilage cells from patients with osteoarthritis and found that after a low dosage of Yamanaka factors the cells no longer secreted the inflammatory factors that provoke the disease. The team also found that human muscle stem cells, which are impaired in a muscle-wasting disease, could be restored to youth. Members of the Stanford team have formed a company, Turn Biotechnologies, to develop therapies for osteoarthritis and other diseases.

The study is “definitively a step forward in the goal of reversing cellular aging,” Dr. Izpisua Belmonte said.” NY Times

March 2020:

How Long Will Coronavirus Live on Surfaces or in the Air Around You?

A new study could have implications for how the general public and health care workers try to avoid transmission of the virus.

“The coronavirus can live for three days on some surfaces, like plastic and steel, new research suggests. Experts say the risk of consumers getting infected from touching those materials is still low, although they offered additional warnings about how long the virus survives in air, which may have important implications for medical workers.

The new study suggests that the virus disintegrates over the course of a day on cardboard, lessening the worry among consumers that deliveries will spread the virus during this period of staying and working from home.

When the virus becomes suspended in droplets smaller than 5 micrometers — known as aerosols — it can stay suspended for about a half-hour, researchers said, before drifting down and settling on surfaces where it can linger for hours. The finding on aerosol in particular is inconsistent with the World Health Organization’s position that the virus is not transported by air.

The virus lives longest on plastic and steel, surviving for up to 72 hours. But the amount of viable virus decreases sharply over this time. It also does poorly on copper, surviving four hours. On cardboard, it survives up to 24 hours, which suggests packages that arrive in the mail should have only low levels of the virus — unless the delivery person has coughed or sneezed on it or has handled it with contaminated hands.

That’s true in general. Unless the people who handle any of these materials are sick, the actual risk of getting infected from any of these materials is low, experts said. “Everything at the grocery store and restaurant takeout containers and bags could in theory have infectious virus on them,” said Dr. Linsey Marr, who was not a member of the research team but is an expert in the transmission of viruses by aerosol at Virginia Tech in Blacksburg. “We could go crazy discussing these ‘what-ifs’ because everyone is a potential source, so we have to focus on the biggest risks.”

If people are concerned about the risk, they could wipe down packages with disinfectant wipes and wash their hands, she said.

It is unclear why cardboard should be a less hospitable environment for the virus than plastic or steel, but it may be explained by the absorbency or fibrous quality of the packaging compared with the other surfaces.

That the virus can survive and stay infectious in aerosols is also important for health care workers.

For weeks experts have maintained that the virus is not airborne. But in fact, it can travel through the air and stay suspended for that period of about a half-hour.

The virus does not linger in the air at high enough levels to be a risk to most people who are not physically near an infected person. But the procedures health care workers use to care for infected patients are likely to generate aerosols.

“Once you get a patient in with severe pneumonia, the patients need to be intubated,” said Dr. Vincent Munster, a virologist at the National Institute of Allergy and Infectious Diseases who led the study. “All these handlings might generate aerosols and droplets.”

Health care workers might also collect those tiny droplets and larger ones on their protective gear when working with infected patients. They might resuspend these big and small droplets into the air when they take off this protective gear and become exposed to the virus then, Dr. Marr cautioned.

And another study, published in JAMA, also indicates that the virus is transported by air. That study, based in Singapore, found the virus on a ventilator in the hospital room of an infected patient, where it could only have reached via the air.

Dr. Marr said the World Health Organization has so far referred to the virus as not airborne, but that health care workers should wear gear, including respirator masks, assuming that it is.

“Based on aerosol science and recent findings on flu virus,” she said, “surgical masks are probably insufficient.”

Dr. Marr said based on physics, an aerosol released at a height of about six feet would fall to the ground after 34 minutes. The findings should not cause the general public to panic, however, because the virus disperses quickly in the air.
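A figure like this can be reproduced, roughly, with Stokes’ law for a small sphere settling in still air. The sketch below is illustrative only; the droplet diameter (5 micrometers), water-like density, air viscosity, and six-foot release height are assumptions, not figures taken from the study:

```python
# Rough Stokes-law estimate of how long a small respiratory droplet
# takes to settle from head height in still air.
# Assumed values (not from the article):
g = 9.81        # gravitational acceleration, m/s^2
d = 5e-6        # droplet diameter, m (5 micrometers, assumed)
rho = 1000.0    # droplet density, kg/m^3 (water, assumed)
mu = 1.81e-5    # dynamic viscosity of air, Pa*s (room temperature)

# Stokes terminal velocity for a small sphere: v = rho * g * d^2 / (18 * mu)
v = rho * g * d**2 / (18 * mu)

height = 1.83   # release height, m (about six feet)
t_min = height / v / 60  # fall time in minutes

print(f"settling speed: {v*1000:.2f} mm/s, fall time: {t_min:.0f} minutes")
```

With these inputs the settling speed comes out well under a millimeter per second and the fall time on the order of 40 minutes, the same ballpark as the half-hour figure quoted above; real rooms have air currents that can keep particles this small aloft even longer.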

“It sounds scary,” she said, “but unless you’re close to someone, the amount you’ve been exposed to is very low.”

Dr. Marr compared this to cigarette smoke or a foggy breath on a frosty day. The closer and sooner another person is to the exhaled smoke or breath, the more of a whiff they might catch; for anyone farther than a few feet away, there is too little of the virus in the air to be any danger.

To assess the ability of the virus to survive in the air, the researchers created what Dr. Munster described as “bizarre experiments done under very ideal controllable experimental conditions.” They used a rotating drum to suspend the aerosols, and provided temperature and humidity levels that closely mimic hospital conditions.

In this setup, the virus survived and stayed infectious for up to three hours, but its ability to infect drops sharply over this time, he said.

He said the aerosols might only stay aloft for about 10 minutes, but Dr. Marr disagreed with that assessment, and said they could stay in the air for three times longer. She also said that the experimental setup might be less comfortable for the virus than a real-life setting.

For example, she said, the researchers used a relative humidity of 65 percent. “Many, but not all, viruses have been shown to survive worst at this level of humidity,” she said. They do best at lower or much higher humidity. The humidity in a heated house is less than 40 percent, “at which the virus might survive even longer,” she said.

Mucus and respiratory fluids might also allow the virus to survive longer than the laboratory fluids the researchers used for their experiments.

Other experts said the paper’s findings illustrate the urgent need for more information about the virus’ ability to survive in aerosols, and under different conditions.

“We need more experiments like this, in particular, extending the experimental sampling time for aerosolized virus beyond three hours and testing survival under different temperature and humidity conditions,” said Dr. Jeffrey Shaman, an environmental health sciences expert at Columbia University.

Dr. Munster noted that, overall, the new coronavirus seems no more capable of surviving for long periods than its close cousins SARS and MERS, which caused previous epidemics. That suggests there are other reasons, such as transmission by people who don’t have symptoms, for its ability to cause a pandemic.” NY Times

Pandemics Damage Compassion, Too

“Some disasters, like hurricanes and earthquakes, can bring people together, but if history is any judge, pandemics generally drive them apart. These are crises in which social distancing is a virtue. Dread overwhelms the normal bonds of human affection.

In “The Decameron,” Giovanni Boccaccio writes about what happened during the plague that hit Florence in 1348: “Tedious were it to recount how citizen avoided citizen, how among neighbors was scarce found any that shewed fellow-feeling for another, how kinfolk held aloof, and never met … “

In his book on the 1665 London epidemic, “A Journal of the Plague Year,” Daniel Defoe reports, “This was a time when every one’s private safety lay so near them they had no room to pity the distresses of others. … nay, what is more, and scarcely to be believed, fathers and mothers were found to abandon their own children, untended, unvisited, to their fate….The danger of immediate death to ourselves, took away all bonds of love, all concern for one another.”

Fear drives people in these moments, but so does shame, caused by the brutal things that have to be done to slow the spread of the disease. In all pandemics people are forced to make the decisions that doctors in Italy are now forced to make — withholding care from some of those who are suffering and leaving them to their fate.

In 17th-century Venice, health workers searched the city, identified plague victims and shipped them off to isolated “hospitals,” where two-thirds of them died. In many cities over the centuries, municipal authorities locked whole families in their homes, sealed the premises and blocked any delivery of provisions or medical care.

Frank Snowden, the Yale historian who wrote “Epidemics and Society,” argues that pandemics hold up a mirror to society and force us to ask basic questions: What is possible imminent death trying to tell us? Where is God in all this? What’s our responsibility to one another?

Pandemics induce a feeling of enervating fatalism. People realize how little they control their lives. Anton Chekhov was a victim during a TB epidemic that traveled across Russia in the late 19th century. Snowden points out that the plays he wrote during his recovery are about people who feel trapped, waiting for events outside their control, unable to act, unable to decide.

Pandemics also hit the poor hardest and inflame class divisions.

Cholera struck Naples in 1884, especially the Lower City, where the poor lived. Rumors swept the neighborhood that city officials were deliberately spreading the disease. When highhanded public health workers poured into Lower City, the locals revolted, throwing furniture at them, hurling them down stairs.

The city thought the disease was passed on by people eating unripe or overripe fruit. The peasants responded by bringing baskets of fruit to City Hall and gorging on it in public — a way to hold up a defiant middle finger against the elites who were so useless in the face of the disease.

The Spanish flu pandemic that battered America in 1918 produced similar reactions. John M. Barry, author of “The Great Influenza,” reports that as conditions worsened, health workers in city after city pleaded for volunteers to care for the sick. Few stepped forward.

In Philadelphia, the head of emergency aid pleaded for help in taking care of sick children. Nobody answered. The organization’s director turned scornful: “Hundreds of women … had delightful dreams of themselves in the roles of angels of mercy. … Nothing seems to rouse them now. … There are families in which every member is ill, in which the children are actually starving because there is no one to give them food. The death rate is so high, and they still hold back.”

This explains one of the puzzling features of the 1918 pandemic. When it was over, people didn’t talk about it. There were very few books or plays written about it. Roughly 675,000 Americans lost their lives to the flu, compared with 53,000 in battle in World War I, and yet it left almost no conscious cultural mark.

Perhaps it’s because people didn’t like who they had become. It was a shameful memory and therefore suppressed. In her 1976 dissertation, “A Cruel Wind,” Dorothy Ann Pettit argues that the 1918 flu pandemic contributed to a kind of spiritual torpor afterward. People emerged from it physically and spiritually fatigued. The flu, Pettit writes, had a sobering and disillusioning effect on the national spirit.

There is one exception to this sad litany: health care workers. In every pandemic there are doctors and nurses who respond with unbelievable heroism and compassion. That’s happening today.

Mike Baker recently had a report about the EvergreenHealth hospital in Kirkland, Wash., where the staff is showing the kind of effective compassion that has been evident in all pandemics down the centuries. “We have not had issues with staff not wanting to come in,” an Evergreen executive said. “We’ve had staff calling and saying, ‘If you need me, I’m available.’”

Maybe this time we’ll learn from their example. It also wouldn’t be a bad idea to take steps to fight the moral disease that accompanies the physical one.” NY Times

How smell, emotion, and memory are intertwined

“…. I carried to my lips a spoonful of the tea in which I had let soften a bit of madeleine. But at the very instant when the mouthful of tea mixed with cake crumbs touched my palate, I quivered, attentive to the extraordinary thing that was happening inside me.

It’s a seminal passage in literature, so famous in fact, that it has its own name: the Proustian moment — a sensory experience that triggers a rush of memories often long past, or even seemingly forgotten. For French author Marcel Proust, who penned the legendary lines in his 1913 novel, “À la recherche du temps perdu,” it was the soupçon of cake in tea that sent his mind reeling.

But according to a biologist and an olfactory branding specialist who spoke at a Harvard panel on Wednesday, it was the nose that was really at work.

This should not be surprising, as neuroscience makes clear. Smell and memory seem to be so closely linked because of the brain’s anatomy, said Harvard’s Raymond Leo Erikson Life Sciences Professor and chair of the Department of Molecular and Cellular Biology. He walked the audience through the science early in the panel discussion “Olfaction in Science and Society,” sponsored by the Harvard Museum of Natural History in collaboration with the Harvard Brain Science Initiative.

Smells are handled by the olfactory bulb, the structure in the front of the brain that sends information to the other areas of the body’s central command for further processing. Odors take a direct route to the limbic system, including the amygdala and the hippocampus, the regions related to emotion and memory. “The olfactory signals very quickly get to the limbic system,” he said.

But, as with Proust, taste plays a role, too, he said; his lab explores the neural and algorithmic basis of odor-guided behaviors in terrestrial animals.

When you chew, molecules in the food, he said, “make their way back retro-nasally to your nasal epithelium,” meaning that essentially, “all of what you consider flavor is smell. When you are eating all the beautiful, complicated flavors … they are all smell.” He said you can test that theory by pinching your nose when eating something such as vanilla or chocolate ice cream. Instead of tasting the flavor, he said, “all you can taste is sweet.”

For decades individuals and businesses have explored ways to harness the evocative power of smell. Think of the cologne or perfume worn by a former flame. And then there was AromaRama or Smell-O-Vision, brainchildren of the film industry of the 1950s that infused movie theaters with appropriate odors in an attempt to pull viewers deeper into a story — and the most recent update, the decade-old 4DX system, which incorporates special effects into movie theaters, such as shaking seats, wind, rain, as well as smells. Several years ago, Harvard scientist David Edwards worked on a new technology that would allow iPhones to share scents as well as photos and texts.

Today, the aroma of a home or office is big business. Scent branding is in vogue across a range of industries, including hotels that often pump their signature scents into rooms and lobbies, noted the authors of a 2018 Harvard Business Review article.

“In an age where it’s becoming more and more difficult to stand out in a crowded market, you must differentiate your brand emotionally and memorably,” they wrote. “Think about your brand in a new way by considering how scent can play a role in making a more powerful impression on your customers.”

Someone who knows that lesson well is Dawn Goldworm, co-founder and nose, or scent, director of what she calls her “olfactive branding company,” 12.29, which uses the “visceral language of scent to transform brand-building” in the actual buildings where clients reside (mostly through ventilation systems or standalone units).

Among Goldworm’s high-profile customers is the sportswear giant Nike. Its signature scent, she explains in a video on her company’s website, was inspired by, among other things, the smell of a rubber basketball sneaker as it scrapes across the court and a soccer cleat in grass and dirt. Her goal, she said, is to create “immediate and memorable connections between brands and consumers.”

Goldworm, who designed signature fragrances for celebrities for more than a decade before starting her own company, knows the science, too. She spent five years in perfumery school followed by a master’s degree at New York University where her thesis focused on olfactory branding.

During the talk she explained that smell is the only fully developed sense a fetus has in the womb, and it’s the one that is the most developed in a child through the age of around 10 when sight takes over. And because “smell and emotion are stored as one memory,” said Goldworm, childhood tends to be the period in which you create “the basis for smells you will like and hate for the rest of your life.” She also explained that people tend to smell in color, demonstrating the connection with pieces of paper dipped in scents that she handed to the audience. Like most people, her listeners associated citrus-flavored mandarin with the colors orange, yellow, and green. When smelling vetiver, a grassy scent, audience members envisioned green and brown.

People do tend to lose their sense of smell as they age, she added. But not to worry. Your nose is like a muscle in the body that can be strengthened, she said, by giving it a daily workout, not with weights, but with sniffs.

“Just pay attention,” with your nose, said Goldworm. “When you are walking down the street, consciously indicate what you are smelling … the more you use [your nose], the stronger it gets.” Neuroscience Journal

Extraordinary Life of André Bamberski and his fight for justice


Doctor who murdered French girl is freed 11 years after her father (André Bamberski) kidnapped him in Germany and smuggled him across the border to stand trial

The cardiologist was jailed in 2011 after being kidnapped by the victim’s biological father

A German ex-doctor and serial rapist who killed André Bamberski’s teenage daughter by lethal injection 38 years ago has now been freed on medical grounds.

Dieter Krombach, 84, was whisked by ambulance from his Paris prison last week after serving nine years of his 15-year sentence for the death of Kalinka Bamberski.

It draws a line under one of the most dramatic cases of recent times, which saw the 14-year-old victim’s biological father kidnap Krombach and drag him to France after Germany refused to hand him over for trial. 

After he was left, bound and gagged, outside a courthouse, Krombach was tried and jailed in 2011.

Last October, a parole committee decreed his sentence should be suspended on medical grounds, which resulted in the doctor being taken away by ambulance from a prison near Paris.

A German investigation in 1987 had found there was not enough evidence to charge Krombach with the murder of Kalinka, who was found dead in her bed during the summer holidays of 1982.

But the doctor’s credibility was damaged when in 1997 he was found guilty of drugging and raping a 16-year-old patient, emboldening Kalinka’s biological father, Andre Bamberski, in his campaign to see Krombach arrested.

Frustrated with Germany’s refusal to hand Krombach over, Bamberski hired a kidnap team to snatch the doctor from his home in Scheidegg, Bavaria, bundle him into a blacked-out limousine and bring him to France.

André Bamberski and his daughter, Kalinka, 38 years ago

The doctor was left, bound and gagged, near a courthouse in the border town of Mulhouse, and later put on trial.

The doctor, whose lover Daniele Gonnin was Kalinka’s mother, is said to have injected the girl with an overdose of a substance used for tanning. 

After Krombach was convicted by a French court in 2011, Kalinka’s father was also put on trial and convicted in 2014 over the kidnapping. 

He was handed a suspended one-year jail sentence.

The two men who carried out the kidnapping – Anton Krasniqi of Kosovo and Georgian Kacha Bablovani – were sentenced to a year in prison each.

The drama of Andre Bamberski’s fight for justice has been turned into the 2016 movie, ‘Au Nom de Ma Fille,’ with the English title ‘Kalinka’.” Daily Mail

February 2020:

Sitting still linked to increased risk of depression in adolescents

“Too much time sitting still – sedentary behavior – is linked to an increased risk of depressive symptoms in adolescents, finds a new UCL-led study.

The Lancet Psychiatry study found that an additional 60 minutes of light activity (such as walking or doing chores) daily at age 12 was associated with a 10% reduction in depressive symptoms at age 18.

“Our findings show that young people who are inactive for large proportions of the day throughout adolescence face a greater risk of depression by age 18. We found that it’s not just more intense forms of activity that are good for our mental health, but any degree of physical activity that can reduce the time we spend sitting down is likely to be beneficial,” said the study’s lead author, PhD student Aaron Kandola (UCL Psychiatry).

“We should be encouraging people of all ages to move more, and to sit less, as it’s good for both our physical and mental health.”

The research team used data from 4,257 adolescents, who have been participating in longitudinal research from birth as part of the University of Bristol’s Children of the 90s cohort study. The children wore accelerometers to track their movement for at least 10 hours over at least three days, at ages 12, 14 and 16.

The accelerometers reported whether the child was engaging in light activity (which could include walking or hobbies such as playing an instrument or painting), engaging in moderate-to-vigorous physical activity (such as running or cycling), or sedentary. The use of accelerometers provided more reliable data than previous studies, which relied on people self-reporting their activity and yielded inconsistent results.

Depressive symptoms, such as low mood, loss of pleasure and poor concentration, were measured with a clinical questionnaire. The questionnaire measures depressive symptoms and their severity on a spectrum, rather than providing a clinical diagnosis.

Between the ages of 12 and 16, total physical activity declined across the cohort, which was mainly due to a decrease in light activity (from an average of five hours, 26 minutes to four hours, five minutes) and an increase in sedentary behaviour (from an average of seven hours and 10 minutes to eight hours and 43 minutes).

The researchers found that every additional 60 minutes of sedentary behaviour per day at age 12, 14 and 16 was associated with an increase in depression score of 11.1%, 8% and 10.5%, respectively, by age 18. Those with consistently high amounts of time spent sedentary at all three ages had 28.2% higher depression scores by age 18.

Every additional hour of light physical activity per day at age 12, 14 and 16 was associated with depression scores at age 18 that were 9.6%, 7.8% and 11.1% lower, respectively.

The researchers found some associations between moderate-to-vigorous activity at earlier ages and reduced depressive symptoms, although they caution that their data was weaker due to low levels of activity of such intensity in the cohort (averaging around 20 minutes per day), so the findings do not clarify whether moderate-to-vigorous activity is any less beneficial than light activity.

While the researchers cannot confirm that the activity levels caused changes to depressive symptoms, they accounted for potentially confounding factors such as socioeconomic status, parental history of mental health, and length of time wearing the accelerometer, and they reduced the possibility of reverse causation by adjusting their analysis to account for people who had depressive symptoms at the study outset.

“Worryingly, the amount of time that young people spend inactive has been steadily rising for years, but there has been a surprising lack of high quality research into how this could affect mental health. The number of young people with depression also appears to be growing and our study suggests that these two trends may be linked,” Kandola added.” Neuroscience Journal

Amazon’s first and second employees:

Amazon “scares me”, “is unfair”

“Now Amazon’s first employee, Shel Kaphan, says a breakup of Amazon “could potentially make sense.”

In an interview for a new PBS Frontline documentary about Amazon viewed by Recode, which airs February 18th, Kaphan said the company’s rise to power has left him conflicted.

“On one hand I’m proud of what it became,” Kaphan told the documentary’s host, James Jacoby. “But it also scares me.”

“I think not all of the effects of the company on the world are the best,” he added. “And I wish it wasn’t so, but I had something to do with bringing it into existence; it’s partly on me.”

Asked if he favors a breakup of the company, which has been advocated by Senator Elizabeth Warren and some anti-monopoly activists, Kaphan said, “I think they’re now at the scale where that could potentially make sense.”

Warren wants to separate Amazon’s retail platform, on which other merchants sell goods, from where Amazon sells its own lines of goods, like AmazonBasics. Her plan is fueled by an argument that Amazon uses data it collects from other merchants in its marketplace to boost its own brands. Other critics have called for Amazon to separate its Amazon Web Services (AWS) cloud computing business from its retail business, based on a belief that Amazon uses profits from AWS to subsidize low prices in its retail business.

Kaphan’s views, combined with recent comments from Amazon’s second employee, Paul Davis, highlight how some early employees of tech giants are grappling with their roles in birthing companies that have amassed so much power in society today. Davis told Recode in December that Amazon’s role as both a retailer and the operator of a retail marketplace on which other merchants sell goods — and Amazon’s access to their data that can be used to compete against them — is unfair.” VOX

Despite being banned, Big Tobacco is still on social media

“Big Tobacco likes to stay ahead of the curve – to survive, it has to. Its fundamental problem is that one in two of its long-term users die from tobacco-related diseases. To hook a new generation into addiction, it has to try every advertising and marketing trick in its playbook.

And it has to be innovative. As one ex-marketing consultant remarked: “The problem is, how do you sell death?” He said the industry did it with great open spaces, such as mountains and lakes. They did it with healthy young people and iconic images. So the Marlboro Man became a symbol of masculinity and, for women, the industry promoted smoking as a “torch of freedom”.

In August 2018, the New York Times investigated Big Tobacco’s social media influence. The paper found 123 hashtags associated with companies’ tobacco products, which had been viewed a staggering 25 billion times. Robert Kozinets, a professor at the University of Southern California, told the newspaper that what the industry was doing was a “really effective way” to get around existing laws to restrict advertising to young people.

The pressure on the industry to act increased in May 2019 when 125 public health organisations called on Facebook, Instagram, Twitter and Snapchat to immediately end the promotion of cigarettes and e-cigarettes. This included banning the use of social media influencers. The social media companies ignored the request.

In December 2019, in a landmark decision, the UK Advertising Standards Authority ruled against British American Tobacco and three other firms for promoting their products on Instagram, after a complaint by Action on Smoking and Health, Campaign for Tobacco-Free Kids and Stopping Tobacco Organisations and Products, of which the University of Bath’s Tobacco Control Research Group is a partner.

In a follow-up statement, Facebook and Instagram announced what many saw as a long-overdue update to their policy on tobacco. It said that branded content that promotes goods such as vaping, tobacco products and weapons “will not be allowed”. The statement made the bold claim that their advertising policies had long “prohibited” the advertisement of these products. The platforms promised that enforcement would begin in the coming weeks.” Independent

Living near major roads linked to risk of dementia, Parkinson’s, Alzheimer’s and MS

“Living near major roads or highways is linked to a higher incidence of dementia, Parkinson’s disease, Alzheimer’s disease and multiple sclerosis (MS), suggests new research published this week in the journal Environmental Health.

Researchers from the University of British Columbia analyzed data for 678,000 adults in Metro Vancouver. They found that living less than 50 meters from a major road or less than 150 meters from a highway is associated with a higher risk of developing dementia, Parkinson’s, Alzheimer’s and MS, likely due to increased exposure to air pollution.

The researchers also found that living near green spaces, like parks, has protective effects against developing these neurological disorders.

“For the first time, we have confirmed a link between air pollution and traffic proximity with a higher risk of dementia, Parkinson’s, Alzheimer’s and MS at the population level,” says Weiran Yuchi, the study’s lead author and a PhD candidate in the UBC school of population and public health. “The good news is that green spaces appear to have some protective effects in reducing the risk of developing one or more of these disorders. More research is needed, but our findings do suggest that urban planning efforts to increase accessibility to green spaces and to reduce motor vehicle traffic would be beneficial for neurological health.”

Neurological disorders–a term that describes a range of disorders, including Alzheimer’s disease and other dementias, Parkinson’s disease, multiple sclerosis, and motor neuron diseases–are increasingly recognized as one of the leading causes of death and disability worldwide. Little is known about the risk factors associated with neurological disorders, the majority of which are incurable and typically worsen over time.

For the study, researchers analyzed data for 678,000 adults between the ages of 45 and 84 who lived in Metro Vancouver from 1994 to 1998 and during a follow-up period from 1999 to 2003. They estimated individual exposures to road proximity, air pollution, noise and greenness at each person’s residence using postal code data. During the follow-up period, the researchers identified 13,170 cases of non-Alzheimer’s dementia, 4,201 cases of Parkinson’s disease, 1,277 cases of Alzheimer’s disease and 658 cases of MS.

For non-Alzheimer’s dementia and Parkinson’s disease specifically, living near major roads or a highway was associated with a 14 percent and a seven percent increased risk, respectively. Due to relatively low numbers of Alzheimer’s and MS cases in Metro Vancouver compared to non-Alzheimer’s dementia and Parkinson’s disease, the researchers did not identify associations between air pollution and increased risk of these two disorders. However, they are now analyzing Canada-wide data and are hopeful the larger dataset will provide more information on the effects of air pollution on Alzheimer’s disease and MS.

When the researchers accounted for green space, they found the effect of air pollution on the neurological disorders was mitigated. The researchers suggest that this protective effect could be due to several factors.

“For people who are exposed to a higher level of green space, they are more likely to be physically active and may also have more social interactions,” said Michael Brauer, the study’s senior author and professor in the UBC school of population and public health. “There may even be benefits from just the visual aspects of vegetation.”

Brauer added that the findings underscore the importance for city planners to ensure they incorporate greenery and parks when planning and developing residential neighborhoods.” Neuroscience Journal

You Are Now Remotely Controlled

NY Times

By Shoshana Zuboff, the author of “The Age of Surveillance Capitalism”

“The data flows empty into surveillance capitalists’ computational factories, called artificial intelligence, where they are manufactured into behavioral predictions that are about us, but they are not for us. Instead, they are sold to business customers in a new kind of market that trades exclusively in human futures. Certainty in human affairs is the lifeblood of these markets, where surveillance capitalists compete on the quality of their predictions. This is a new form of trade that birthed some of the richest and most powerful companies in history.

In order to achieve their objectives, the leading surveillance capitalists sought to establish unrivaled dominance over the 99.9 percent of the world’s information now rendered in digital formats that they helped to create. Surveillance capital has built most of the world’s largest computer networks, data centers, populations of servers, undersea transmission cables, advanced microchips, and frontier machine intelligence, igniting an arms race for the 10,000 or so specialists on the planet who know how to coax knowledge from these vast new data continents.

With Google in the lead, the top surveillance capitalists seek to control labor markets in critical expertise, including data science and animal research, elbowing out competitors such as start-ups, universities, high schools, municipalities, established corporations in other industries and less wealthy countries. In 2016, 57 percent of American computer science Ph.D. graduates took jobs in industry, while only 11 percent became tenure-track faculty members. It’s not just an American problem. In Britain, university administrators contemplate a “missing generation” of data scientists. A Canadian scientist laments, “the power, the expertise, the data are all concentrated in the hands of a few companies.”

Google created the first insanely lucrative markets to trade in human futures, what we now know as online targeted advertising, based on their predictions of which ads users would click. Between 2000, when the new economic logic was just emerging, and 2004, when the company went public, revenues increased by 3,590 percent. This startling number represents the “surveillance dividend.” It quickly reset the bar for investors, eventually driving start-ups, apps developers and established companies to shift their business models toward surveillance capitalism. The promise of a fast track to outsized revenues from selling human futures drove this migration first to Facebook, then through the tech sector and now throughout the rest of the economy to industries as disparate as insurance, retail, finance, education, health care, real estate, entertainment and every product that begins with the word “smart” or service touted as “personalized.”

Unequal knowledge about us produces unequal power over us, and so epistemic inequality widens to include the distance between what we can do and what can be done to us. Data scientists describe this as the shift from monitoring to actuation, in which a critical mass of knowledge about a machine system enables the remote control of that system. Now people have become targets for remote control, as surveillance capitalists discovered that the most predictive data come from intervening in behavior to tune, herd and modify action in the direction of commercial objectives. This third imperative, “economies of action,” has become an arena of intense experimentation. “We are learning how to write the music,” one scientist said, “and then we let the music make them dance.”

This new power “to make them dance” does not employ soldiers to threaten terror and murder. It arrives carrying a cappuccino, not a gun. It is a new “instrumentarian” power that works its will through the medium of ubiquitous digital instrumentation to manipulate subliminal cues, psychologically target communications, impose default choice architectures, trigger social comparison dynamics and levy rewards and punishments — all of it aimed at remotely tuning, herding and modifying human behavior in the direction of profitable outcomes and always engineered to preserve users’ ignorance.

We saw predictive knowledge morphing into instrumentarian power in Facebook’s contagion experiments published in 2012 and 2014, when it planted subliminal cues and manipulated social comparisons on its pages, first to influence users to vote in midterm elections and later to make people feel sadder or happier. Facebook researchers celebrated the success of these experiments noting two key findings: that it was possible to manipulate online cues to influence real world behavior and feelings, and that this could be accomplished while successfully bypassing users’ awareness.

In 2016, the Google-incubated augmented reality game, Pokémon Go, tested economies of action on the streets. Game players did not know that they were pawns in the real game of behavior modification for profit, as the rewards and punishments of hunting imaginary creatures were used to herd people to the McDonald’s, Starbucks and local pizza joints that were paying the company for footfall in exactly the same way that online advertisers pay for “click through” to their websites.

In 2017, a leaked Facebook document acquired by The Australian exposed the corporation’s interest in applying “psychological insights” from “internal Facebook data” to modify user behavior. The targets were 6.4 million young Australians and New Zealanders. “By monitoring posts, pictures, interactions and internet activity in real time,” the executives wrote, “Facebook can work out when young people feel ‘stressed,’ ‘defeated,’ ‘overwhelmed,’ ‘anxious,’ ‘nervous,’ ‘stupid,’ ‘silly,’ ‘useless’ and a ‘failure.’” This depth of information, they explained, allows Facebook to pinpoint the time frame during which a young person needs a “confidence boost” and is most vulnerable to a specific configuration of subliminal cues and triggers. The data are then used to match each emotional phase with appropriate ad messaging for the maximum probability of guaranteed sales.

Facebook denied these practices, though a former product manager accused the company of “lying through its teeth.” The fact is that in the absence of corporate transparency and democratic oversight, epistemic inequality rules. They know. They decide who knows. They decide who decides.

The public’s intolerable knowledge disadvantage is deepened by surveillance capitalists’ perfection of mass communications as gaslighting. Two examples are illustrative. On April 30, 2019, Mark Zuckerberg made a dramatic announcement at the company’s annual developer conference, declaring, “The future is private.” A few weeks later, a Facebook litigator appeared before a federal district judge in California to thwart a user lawsuit over privacy invasion, arguing that the very act of using Facebook negates any reasonable expectation of privacy “as a matter of law.” In May 2019 Sundar Pichai, chief executive of Google, wrote in The Times of his corporation’s commitment to the principle that “privacy cannot be a luxury good,” but five months later Google contractors were found offering $5 gift cards to homeless people of color in an Atlanta park in return for a facial scan.

Facebook’s denial invites even more scrutiny in light of another leaked company document appearing in 2018. The confidential report offers rare insight into the heart of Facebook’s computational factory, where a “prediction engine” runs on a machine intelligence platform that “ingests trillions of data points every day, trains thousands of models” and then “deploys them to the server fleet for live predictions.” Facebook notes that its “prediction service” produces “more than 6 million predictions per second.” But to what purpose?

In its report, the company makes clear that these extraordinary capabilities are dedicated to meeting its corporate customers’ “core business challenges” with procedures that link prediction, microtargeting, intervention and behavior modification. For example, a Facebook service called “loyalty prediction” is touted for its ability to plumb proprietary behavioral surplus to predict individuals who are “at risk” of shifting their brand allegiance and alerting advertisers to intervene promptly with targeted messages designed to stabilize loyalty just in time to alter the course of the future.

That year a young man named Christopher Wylie turned whistle-blower on his former employer, a political consultancy known as Cambridge Analytica. “We exploited Facebook to harvest millions of people’s profiles,” Wylie admitted, “and built models to exploit what we knew about them and target their inner demons.” Mr. Wylie characterized those techniques as “information warfare,” correctly assessing that such shadow wars are built on asymmetries of knowledge and the power it affords. Less clear to the public or lawmakers was that the political firm’s strategies of secret invasion and conquest employed surveillance capitalism’s standard operating procedures, to which billions of innocent “users” are routinely subjected each day. Mr. Wylie described this mirroring process as he followed a trail that was already cut and marked. Cambridge Analytica’s real innovation was to pivot the whole undertaking from commercial to political objectives.

In other words, Cambridge Analytica was the parasite, and surveillance capitalism was the host. Thanks to its epistemic dominance, surveillance capitalism provided the behavioral data that exposed the targets for assault. Its methods of behavioral microtargeting and behavioral modification became the weapons. And it was surveillance capitalism’s lack of accountability for content on its platform afforded by Section 230 that provided the opportunity for the stealth attacks designed to trigger the inner demons of unsuspecting citizens.

It’s not just that epistemic inequality leaves us utterly vulnerable to the attacks of actors like Cambridge Analytica. The larger and more disturbing point is that surveillance capitalism has turned epistemic inequality into a defining condition of our societies, normalizing information warfare as a chronic feature of our daily reality, prosecuted by the very corporations upon which we depend for effective social participation. They have the knowledge, the machines, the science and the scientists, the secrets and the lies. All privacy now rests with them, leaving us with few means of defense from these marauding data invaders. Without law, we scramble to hide in our own lives, while our children debate encryption strategies around the dinner table and students wear masks to public protests as protection from facial recognition systems built with our family photos.

In the absence of new declarations of epistemic rights and legislation, surveillance capitalism threatens to remake society as it unmakes democracy. From below, it undermines human agency, usurping privacy, diminishing autonomy and depriving individuals of the right to combat. From above, epistemic inequality and injustice are fundamentally incompatible with the aspirations of a democratic people.

We know that surveillance capitalists work in the shadows, but what they do there and the knowledge they accrue are unknown to us. They have the means to know everything about us, but we can know little about them. Their knowledge of us is not for us. Instead, our futures are sold for others’ profits. Since that Federal Trade Commission meeting in 1997, the line was never drawn, and people did become chattel for commerce. Another destructive delusion is that this outcome was inevitable — an unavoidable consequence of convenience-enhancing digital technologies. The truth is that surveillance capitalism hijacked the digital medium. There was nothing inevitable about it.

American lawmakers have been reluctant to take on these challenges for many reasons. One is an unwritten policy of “surveillance exceptionalism” forged in the aftermath of the Sept. 11 terrorist attacks, when the government’s concerns shifted from online privacy protections to a new zeal for “total information awareness.” In that political environment the fledgling surveillance capabilities emerging from Silicon Valley appeared to hold great promise.

Surveillance capitalists have also defended themselves with lobbying and forms of propaganda intended to undermine and intimidate lawmakers, confounding judgment and freezing action. These have received relatively little scrutiny compared to the damage they do. Consider two examples:

The first is the assertion that democracy threatens prosperity and innovation. Former Google chief executive Eric Schmidt explained in 2011, “we took the position of ‘hands off the internet.’ You know, leave us alone … The government can make regulatory mistakes that can slow this whole thing down, and we see that and we worry about it.” This propaganda is recycled from the Gilded Age barons, whom we now call “robbers.” They insisted that there was no need for law when one had the “law of survival of the fittest,” the “laws of capital” and the “law of supply and demand.”

Paradoxically, surveillance capital does not appear to drive innovation. A promising new era of economic research shows the critical role that government and democratic governance have played in innovation and suggests a lack of innovation in big tech companies like Google. Surveillance capitalism’s information dominance is not dedicated to the urgent challenges of carbon-free energy, eliminating hunger, curing cancers, ridding the oceans of plastic or flooding the world with well paid, smart, loving teachers and doctors. Instead, we see a frontier operation run by geniuses with vast capital and computational power that is furiously dedicated to the lucrative science and economics of human prediction for profit.

The second form of propaganda is the argument that the success of the leading surveillance capitalist firms reflects the real value they bring to people. But data from the demand side suggest that surveillance capitalism is better understood as a market failure. Instead of a close alignment of supply and demand, people use these services because they have no comparable alternatives and because they are ignorant of surveillance capitalism’s shadow operations and their consequences. Pew Research Center recently reported that 81 percent of Americans believe the potential risks of companies’ data collection outweigh the benefits, suggesting that corporate success depends upon coercion and obfuscation rather than meeting peoples’ real needs.

In his prizewinning history of regulation, the historian Thomas McCraw delivers a warning. Across the centuries regulators failed when they did not frame “strategies appropriate to the particular industries they were regulating.” Existing privacy and antitrust laws are vital but neither will be wholly adequate to the new challenges of reversing epistemic inequality.

These contests of the 21st century demand a framework of epistemic rights enshrined in law and subject to democratic governance. Such rights would interrupt data supply chains by safeguarding the boundaries of human experience before they come under assault from the forces of datafication. The choice to turn any aspect of one’s life into data must belong to individuals by virtue of their rights in a democratic society. This means, for example, that companies cannot claim the right to your face, or use your face as free raw material for analysis, or own and sell any computational products that derive from your face.

On the demand side, we can outlaw human futures markets and thus eliminate the financial incentives that sustain the surveillance dividend. This is not a radical prospect. For example, societies outlaw markets that trade in human organs, babies and slaves. In each case, we recognize that such markets are both morally repugnant and produce predictably violent consequences. Human futures markets can be shown to produce equally predictable outcomes that challenge human freedom and undermine democracy. Like subprime mortgages and fossil fuel investments, surveillance assets will become the new toxic assets.

In support of a new competitive landscape, lawmakers will need to champion new forms of collective action, just as nearly a century ago legal protections for the rights to organize, to strike and to bargain collectively united lawmakers and workers in curbing the powers of monopoly capitalists. Lawmakers must seek alliances with citizens who are deeply concerned over the unchecked power of the surveillance capitalists and with workers who seek fair wages and reasonable security in defiance of the precarious employment conditions that define the surveillance economy.

Anything made by humans can be unmade by humans. Surveillance capitalism is young, barely 20 years in the making, but democracy is old, rooted in generations of hope and contest.

Surveillance capitalists are rich and powerful, but they are not invulnerable. They have an Achilles heel: fear. They fear lawmakers who do not fear them. They fear citizens who demand a new road forward as they insist on new answers to old questions: Who will know? Who will decide who knows? Who will decide who decides? Who will write the music, and who will dance?”

NY Times

January 2020:

Air Pollution, Evolution, and the Fate of Billions of Humans

It’s not just a modern problem. Airborne toxins are so pernicious that they may have shaped our DNA over millions of years.

“The threat of air pollution grabs our attention when we see it — for example, the tendrils of smoke of Australian brush fires, now visible from space, or the poisonous soup of smog that descends on cities like New Delhi in the winter.

But polluted air also harms billions of people on a continuing basis. Outdoors, we breathe in toxins delivered by car traffic, coal-fired plants and oil refineries. Indoor fires for heat and cooking taint the air for billions of people in poor countries. Over a billion people add toxins to their lungs by smoking cigarettes and, more recently, by vaping.

Ninety-two percent of the world’s people live in places where fine particulate matter — the very small particles most dangerous to human tissues — exceeds the World Health Organization’s guideline for healthy air. Air pollution and tobacco together are responsible for up to 20 million premature deaths each year.

Airborne toxins damage us in a staggering number of ways. Along with well-established links to lung cancer and heart disease, researchers are now finding new connections to disorders such as diabetes and Alzheimer’s disease.

Scientists are still figuring out how air pollution causes these ailments. They are also puzzling over the apparent resilience that some people have to this modern onslaught.

Some researchers now argue that the answers to these questions lie in our distant evolutionary past, millions of years before the first cigarette was lit and the first car hit the road. Our ancestors were bedeviled by airborne toxins even as bipedal apes walking the African savanna, argued Benjamin Trumble, a biologist at Arizona State University, and Caleb Finch of the University of Southern California, in the December issue of the Quarterly Review of Biology. But our evolutionary legacy may also be a burden, Dr. Trumble and Dr. Finch speculated. Some genetic adaptations may have increased our vulnerability to diseases linked to air pollution.

It is “a really creative, interesting contribution to evolutionary medicine,” said Molly Fox, an anthropologist at the University of California, Los Angeles, who was not involved in the new study.

The story begins about seven million years ago. Africa at the time was gradually growing more arid. The Sahara emerged in northern Africa, while grasslands opened up in eastern and southern Africa.

The ancestors of chimpanzees and gorillas remained in the retreating forests, but our ancient relatives adapted to the new environments. They evolved tall, slender frames well suited to walking and running long distances.

Dr. Finch and Dr. Trumble believe that early humans faced another challenge that has gone largely overlooked: the air.

Periodically, the savanna would have experienced heavy dust storms from the Sahara, and our distant ancestors may have risked harm to their lungs from breathing in the silica-rich particles.

“When the dust is up, we’re going to see more pulmonary problems,” Dr. Finch said. Even today, Greek researchers have found that when Sahara winds reach their country, patients surge into hospitals with respiratory complaints.

The dense foliage of tropical forests gave chimpanzees and gorillas a refuge from dust. But the earliest humans, wandering the open grasslands, had nowhere to hide.

Dust was not the only hazard. The lungs of early humans also may have been irritated by the high levels of pollen and particles of fecal matter produced by the savanna’s vast herds of grazing animals.

Dr. Finch and Dr. Trumble maintain that scientists should consider whether these new challenges altered our biology through natural selection. Is it possible, for instance, that people who are resilient to cigarette smoke have inherited genetic variants that protected their distant ancestors from cave fires?

One way to answer these questions is to look at genes that have evolved significantly since our ancestors moved out of the forests.

One of them is MARCO, which provides the blueprint for production of a molecular hook used by immune cells in our lungs. The cells use this hook to clear away both bacteria and particles, including silica dust.

Later, our ancestors added to airborne threats by mastering fire. As they lingered near hearths to cook food, stay warm or keep away from insects, they breathed in smoke. Once early humans began building shelters, the environment became more harmful to their lungs.

“Most traditional people live in a highly smoky environment,” Dr. Finch said. “I think it has been a fact of human living for us even before our species.”

Smoke created a new evolutionary pressure, he and Dr. Trumble believe. Humans evolved powerful liver enzymes, for example, to break down toxins passing into the bloodstream from the lungs.

Gary Perdew, a molecular toxicologist at Penn State University, and his colleagues have found evidence of smoke-driven evolution in another gene, AHR.

This gene makes a protein found on cells in the gut, lungs and skin. When toxins get snagged on the protein, cells release enzymes that break down the poisons.

Other mammals use AHR to detoxify their food. But the protein is also effective against some of the compounds in wood smoke.

Compared to other species, the human version produces a weaker response to toxins, perhaps because the AHR protein is not a perfect protector — the fragments it leaves behind can cause tissue damage.

Clean water, improved medicines and other innovations drastically reduced deaths from infectious diseases. The average life expectancy shot up. But our exposure to airborne toxins also increased.

“If we compressed the last five million years into a single year, it wouldn’t be until Dec. 31, 11:40 p.m., that the Industrial Revolution begins,” Dr. Trumble said. “We are living in just the tiniest little blip of human existence, yet we think everything around us is what’s normal.”

The Industrial Revolution was powered largely by coal, and people began breathing the fumes. Cars became ubiquitous; power plants and oil refineries spread. Tobacco companies made cigarettes on an industrial scale. Today, they sell 6.5 trillion cigarettes every year.

Our bodies responded with defenses honed over hundreds of thousands of years. One of their most potent responses was inflammation. But instead of brief bursts of inflammation, many people began to experience it constantly.

Many studies now suggest that chronic inflammation represents an important link between airborne toxins and disease. In the brain, for example, chronic inflammation may impair our ability to clear up defective proteins. As those proteins accumulate, they may lead to dementia.

Pathogens can hitch a ride on particles of pollutants. When they get in our noses, they can make contact with nerve endings. There, they can trigger even more inflammation.

“They provide this highway that’s a direct route to the brain,” Dr. Fox, of the University of California, Los Angeles, said. “I think that’s what makes this a particularly scary story.”

Some genetic variants that arose in our smoky past may offer some help now. They might allow some people to live long despite smoking, Dr. Finch and Dr. Trumble suggest.

But the researchers have studied another gene for which the opposite seems to be true: a variant that was once helpful has become harmful in an age of rising air pollution.

That gene is ApoE4, best known for drastically raising the risk of developing Alzheimer’s disease. More recently, researchers have also discovered that ApoE4 increases the risk that exposure to air pollution leads to dementia.

But these studies were restricted to industrialized countries. When researchers looked to other societies — such as farmers in poor villages in Ghana, or indigenous forest dwellers in Bolivia — ApoE4 had a very different effect. In these societies, infectious diseases remain a major cause of death, especially in children. Researchers have found that in such places, ApoE4 increases the odds that people will survive to adulthood and have children.

Natural selection may have favored ApoE4 for hundreds of thousands of years because of this ability to increase survival. But this gene and others may have had harmful side effects that remained invisible until the sooty, smoky modern age.”

NY Times

Food textures affect perceptions of healthiness

“New research has demonstrated how food producers could change the surface texture of products to change people’s perceptions and promote healthy eating.

The study, led by Consumer Psychologist Dr Cathrine Jansson-Boyd of Anglia Ruskin University (ARU), investigated people’s perceptions of identical biscuits with six different textures.

Published in the journal Food Quality and Preference, the research involved 88 people rating the six oat biscuits on healthiness, tastiness, crunchiness, chewiness, pleasantness and likelihood of purchase based only on their visual appearance, not on their taste or touch.

Previous studies have shown that packaging, labelling and even the texture of a cup or plate can alter people’s perception of food. This new study looked at how a food product itself can be perceived differently depending on its appearance.

Oat biscuits were chosen as they can represent both a “healthy” and an “unhealthy” snack. The research found that the surface texture of the oat biscuit clearly communicated to people how healthy it was likely to be, and the participants viewed the biscuits that had an explicit, pronounced texture as healthier.

However, the biscuits that had a less explicitly textured surface were perceived to be tastier, crunchier and more likely to be purchased. The study found that perceived tastiness increases as healthiness decreases, and the likelihood of purchasing the biscuit increases when perceived healthiness is low and decreases when healthiness is higher.

Therefore, having a ‘healthy-looking’ texture is considered a negative attribute in that it reduces perceived tastiness, a key criterion for purchasing biscuits. This has implications for producers of many different food types.

Dr Jansson-Boyd, Reader in Psychology at Anglia Ruskin University (ARU), said: “The findings are really exciting as they give food manufacturers a means to design foods that can help consumers make healthier choices.

A sweet item, such as a biscuit, benefits from having a less healthy-looking appearance, as that increases the perception of tastiness and the likelihood of purchase. To guide healthier purchasing decisions, food producers can therefore give healthy products smoother, less ‘healthy-looking’ textures to overcome the perception that healthy is not tasty.

“At a time when the World Health Organisation has declared that there is an obesity epidemic, it is essential to think of ways to encourage improved eating patterns. Our research provides a good starting point in how to promote healthier food products.”

Neuroscience Journal

Microsoft Pushes Cloud Services to Retailers Anxious to Avoid Amazon

“Microsoft Corp. is unveiling new cloud tools designed for retail customers, seeking to position itself as an alternative to Amazon and corporate software companies like Slack Technologies Inc. and Salesforce.

Microsoft is adding a feature to its Slack rival, the Teams corporate chat program, that lets store workers push a button to turn their mobile phones into walkie-talkies for in-store communications. In a speech on Jan. 12 at a retail industry event, Microsoft Chief Executive Officer Satya Nadella plans to discuss how Ikea shifted more than 70,000 workers to Teams, using the service for meetings and chat. The home furnishing giant’s largest store, in Stockholm, also started using a scheduling feature to manage the shifts of 150 restaurant staffers.

Ikea is also working with Microsoft to determine if Teams can play a role in its “store of the future” concepts. The Swedish company may put video screens in stores that use Teams to connect customers with kitchen design advisers, said Kenneth Lindegaard, an Ikea vice president. The company plans to have the rest of its 165,000-person workforce on Office 365 cloud software and Teams by the end of spring, although Ikea still has some smaller groups using Slack and Google’s G Suite, he said.  Ikea also uses Microsoft’s Azure and Google Cloud, he said.

The retail industry has been one of Microsoft’s most successful sectors as the software maker tries to gain ground in cloud computing against market leader Amazon Web Services and lure more customers to its internet-based Office products. Some retailers are loath to work with e-commerce rival Amazon. Nadella and Google Cloud chief Thomas Kurian are set to speak next week at the annual show of the National Retail Federation, the biggest retail trade group, underscoring how significant the industry is to Amazon’s biggest cloud competitors.

 “A key part of our offering is that we partner and we don’t compete,” said Bransten, the vice president who oversees Microsoft’s work with retailers and consumer goods companies. But there are other benefits to working closely with retailers, she said in an interview. Some of the software products built for retailers will be useful for companies in other industries.

For example, the walkie-talkie feature in Teams can help manufacturers, said Williams, a Microsoft vice president who is charged with adding features to Office and Teams for use by customers in health care, retail, manufacturing and finance.” Financial Post

Will Science Ever Give Us a Better Night’s Sleep?

“We humans spend a third of our lives asleep, oblivious to our surroundings and temporarily paralyzed. It’s a vulnerability that would seem to diminish our odds of survival, so evolutionarily speaking it must also somehow confer tremendous benefits. Yet our best guesses about what those benefits are tend to come from observing what happens when sleep is curtailed. As far as we know, all animals sleep in some way; deprive most of them of it for long enough, and they will die, but exactly why is unclear. In 2015, the American Academy of Sleep Medicine and the Sleep Research Society published a joint statement, based on a comprehensive review of research, saying that “sleeping less than seven hours per night on a regular basis” — which is the case for an estimated 35 to 40 percent of Americans during the workweek — is associated with adverse health outcomes. These include weight gain and obesity, diabetes, hypertension, heart disease and stroke, depression, impaired immune function, increased pain, greater likelihood of accidents and “increased risk of death.” The National Institutes of Health reported last year that sleep deficits may increase the beta-amyloid proteins in the brain linked with Alzheimer’s disease. But when it comes to “what sleep is, how much you need and what it’s for,” says Louis Ptacek, a professor of neurology at the University of California, San Francisco, “we know almost nothing — other than it’s bad not to get enough of it.” Indeed, says David Dinges, one of the statement’s authors and a professor of psychiatry at the University of Pennsylvania, “All of this makes it really tough to send out simple messages to the public about when you should sleep and how much you should sleep.”

Scientists believe that there are two separate but interrelated internal systems that regulate sleep. The first is the circadian system that tells our body when to sleep. Medicine already knows a great deal about how it works: Approximately every 24 hours, the suprachiasmatic nucleus, a small region in the hypothalamus, orchestrates physiological changes to prepare us for sleep, like lowering body temperature and releasing melatonin. But the second system — the one that tells our body the amount of sleep it needs — is still mysterious. One way to elucidate it would be to find genes that govern how long or deeply people sleep and observe where those genes are active. This fall, Ptacek announced, in the journals Neuron and Science Translational Medicine, the discovery of two genetic mutations that seem to cause certain people to sleep far less than average. This brought the number of genes known to be involved in sleep duration to just three.

To learn what genes underlie a given trait or behavior, researchers look for anomalies. If you sequenced the genes of any two random people, you would find millions of variations between them, and there would be no telling which variation contributed to what traits. But diseases or disorders have discrete and relatively uncommon collections of symptoms that, when seen in families, suggest a genetic cause; if you look at the genes of five relatives who all have a particular type of early-onset Alzheimer’s disease, for example, and find a mutation that they all share, it’s much likelier to be related to their symptoms. Scientists can then engineer the same mutation in mice to see if it produces a similar effect in them. If it does, by observing where the gene is active in the brain, researchers can begin to study the processes that give rise to that specific collection of symptoms; the hope is that those processes will also offer clues to what neural mechanisms initiate more prevalent forms of the disease. Ptacek began his career hunting for genes responsible for neurological diseases and disorders, including Alzheimer’s and epilepsy. Then a colleague, a sleep neurologist, approached him with an intriguing case: a woman whose circadian clock seemed to be set four hours early, which caused her to go to sleep around 7:30 p.m. and wake around 4:30 a.m. Even more intriguing, she came from a large family with many members who shared her unusual behavior. This seemed like a rare opportunity. Distinctive sleep traits can be hard to spot, because of all the ways we can manage or obscure them — for example, by altering our natural sleep rhythms with environmental inputs like lights, LED screens, alarm clocks and caffeine. And this wasn’t a debilitating disorder for which people would necessarily seek treatment. (The woman did so only because she didn’t like waking up when it was “cold, dark and lonely,” Ptacek says.)
In 2001, the research group, then at the University of Utah, announced it had found a genetic mutation responsible for the woman and her relatives’ early rising. After their discovery generated headlines, they were contacted by others who believed they had the “morning lark gene.” By giving them detailed questionnaires to identify those whose early rising was innate, rather than environmentally influenced, the researchers happened upon a family with members who woke up early but did not also go to bed early — they averaged only 6.25 hours of sleep per night yet reported feeling fine afterward. Ptacek, by now at U.C.S.F., found that they shared a genetic mutation that caused their short sleep. The publication of that finding in 2009 inspired others who habitually sleep less than seven hours per night to contact the group, leading to the identification of the two additional mutations reported this fall, which appear to cause other forms of so-called naturally short sleep. When engineered in mice, those mutations truncated their sleep without any obvious ill effects; in particular, mice engineered with the second mutation performed just as well on memory tests as non-sleep-deprived mice. The naturally short-sleeping people they’ve met, the researchers wrote in Science Translational Medicine, seem especially healthy, too; they are also “optimistic, with high pain threshold.” (The second of the mutations alters a gene that has also been shown to facilitate learning and memory, reduce anxiety and block the detection of pain.) It’s possible, the researchers theorize, that the mutations that cause short sleep are also somehow “compensating for” or rendering people “impervious to” the negative health outcomes typically tied to sleep deficits. If researchers can understand how the mutations alter brain activity, “we can help everybody to sleep more efficiently” — possibly by formulating a drug that achieves the same effect. 
Because a lack of sleep is associated with such a wide range of negative outcomes, a sleep-optimizing drug could transform almost every aspect of human health. The question remains, though, how to tell whether sleep is being optimized if we don’t fully know its purpose. The “work” of sleep, whatever that is, may be getting done more efficiently in the short-sleepers than in the rest of us. Or they may be compensating for less sleep more effectively. “How do we know if you’ve accomplished the work?” says William J. Schwartz, a neurologist at Dell Medical School at the University of Texas at Austin. “At the moment, still, the routine clinical measure that you’re getting enough sleep is that you’re not sleepy. That’s not going to get us very far.” But it is a starting point, however crude, for identifying sleep anomalies, and characterizing more of them in finer detail has begun to reveal other potentially heritable traits. Sleep apnea, for instance, was once thought to be a structural problem with a person’s airway; it is now thought to encompass several subtypes, classified by subtly different symptoms throughout the body, some of which might be genetic, says Clete Kushida, medical director of the Stanford University Sleep Medicine Division. “In the next couple of years, there’s going to be more exploration and more findings that will show maybe some of the disorders we thought weren’t genetic might be genetic, and some that we thought were more homogeneous aren’t,” he says.”

NY Times

News 2019