Shared posts

08 Feb 12:38

Using noise to shape motor learning

by Thorp, E. B., Kording, K. P., Mussa-Ivaldi, F. A.

Each of our movements is selected from any number of alternative movements. Some studies have shown evidence that the central nervous system (CNS) chooses to make the specific movements that are least affected by motor noise. Previous results showing that the CNS has a natural tendency to minimize the effects of noise make the direct prediction that if the relationship between movements and noise were to change, the specific movements people learn to make would also change in a predictable manner. Indeed, this has been shown for well-practiced movements such as reaching. Here, we artificially manipulated the relationship between movements and visuomotor noise by adding noise to a motor task in a novel redundant geometry such that there arose a single control policy that minimized the noise. This allowed us to see whether, for a novel motor task, people could learn the specific control policy that minimized noise or would need to employ other compensation strategies to overcome the added noise. As predicted, subjects were able to learn movements that were biased toward the specific ones that minimized the noise, suggesting not only that the CNS can learn to minimize the effects of noise in a novel motor task but also that artificial visuomotor noise can be a useful tool for teaching people to make specific movements. Using noise as a teaching signal promises to be useful for rehabilitative therapies and movement training with human-machine interfaces.

NEW & NOTEWORTHY Many theories argue that we choose to make the specific movements that minimize motor noise. Here, by changing the relationship between movements and noise, we show that people actively learn to make movements that minimize noise. This not only provides direct evidence for the theories of noise minimization but presents a way to use noise to teach specific movements to improve rehabilitative therapies and human-machine interface control.

04 Feb 02:13

Information Geometry (Part 16)

by John Baez

This week I’m giving a talk on biology and information:

• John Baez, Biology as information dynamics, talk for Biological Complexity: Can it be Quantified?, a workshop at the Beyond Center, 2 February 2017.

While preparing this talk, I discovered a cool fact. I doubt it’s new, but I haven’t exactly seen it elsewhere. I came up with it while trying to give a precise and general statement of ‘Fisher’s fundamental theorem of natural selection’. I won’t start by explaining that theorem, since my version looks rather different than Fisher’s, and I came up with mine precisely because I had trouble understanding his. I’ll say a bit more about this at the end.

Here’s my version:

The square of the rate at which a population learns information is the variance of its fitness.

This is a nice advertisement for the virtues of diversity: more variance means faster learning. But it requires some explanation!

The setup

Let’s start by assuming we have n different kinds of self-replicating entities with populations P_1, \dots, P_n. As usual, these could be all sorts of things:

• molecules of different chemicals
• organisms belonging to different species
• genes of different alleles
• restaurants belonging to different chains
• people with different beliefs
• game-players with different strategies
• etc.

I’ll call them replicators of different species.

Let’s suppose each population P_i is a function of time that grows at a rate equal to this population times its ‘fitness’. I explained the resulting equation back in Part 9, but it’s pretty simple:

\displaystyle{ \frac{d}{d t} P_i(t) = f_i(P_1(t), \dots, P_n(t)) \, P_i(t)   }

Here f_i is a completely arbitrary smooth function of all the populations! We call it the fitness of the ith species.

This equation is important, so we want a short way to write it. I’ll often write f_i(P_1(t), \dots, P_n(t)) simply as f_i, and P_i(t) simply as P_i. With these abbreviations, which any red-blooded physicist would take for granted, our equation becomes simply this:

\displaystyle{ \frac{dP_i}{d t}  = f_i \, P_i   }

Next, let p_i(t) be the probability that a randomly chosen organism is of the ith species:

\displaystyle{ p_i(t) = \frac{P_i(t)}{\sum_j P_j(t)} }

Starting from our equation describing how the populations evolve, we can figure out how these probabilities evolve. The answer is called the replicator equation:

\displaystyle{ \frac{d}{d t} p_i(t)  = ( f_i - \langle f \rangle ) \, p_i(t) }

Here \langle f \rangle is the average fitness of all the replicators, or mean fitness:

\displaystyle{ \langle f \rangle = \sum_j f_j(P_1(t), \dots, P_n(t)) \, p_j(t)  }

In what follows I’ll abbreviate the replicator equation as follows:

\displaystyle{ \frac{dp_i}{d t}  = ( f_i - \langle f \rangle ) \, p_i }
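
If you want to check this, it's a one-line computation: differentiate p_i(t) = P_i(t)/\sum_j P_j(t) with the quotient rule and use dP_i/dt = f_i \, P_i :

\displaystyle{ \frac{dp_i}{dt} = \frac{1}{\sum_j P_j}\frac{dP_i}{dt} - \frac{P_i}{\left(\sum_j P_j\right)^2} \sum_j \frac{dP_j}{dt} = f_i \, p_i - p_i \sum_j f_j \, p_j = ( f_i - \langle f \rangle ) \, p_i }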

The result

Okay, now let’s figure out how fast the probability distribution

p(t) = (p_1(t), \dots, p_n(t))

changes with time. For this we need to choose a way to measure the length of the vector

\displaystyle{  \frac{dp}{dt} = (\frac{d}{dt} p_1(t), \dots, \frac{d}{dt} p_n(t)) }

And here information geometry comes to the rescue! We can use the Fisher information metric, which is a Riemannian metric on the space of probability distributions.

I’ve talked about the Fisher information metric in many ways in this series. The most important fact is that as a probability distribution p(t) changes with time, its speed

\displaystyle{  \left\| \frac{dp}{dt} \right\|}

as measured using the Fisher information metric can be seen as the rate at which information is learned. I’ll explain that later. Right now I just want a simple formula for the Fisher information metric. Suppose v and w are two tangent vectors to the point p in the space of probability distributions. Then the Fisher information metric is given as follows:

\displaystyle{ \langle v, w \rangle = \sum_i \frac{1}{p_i} \, v_i w_i }

Using this we can calculate the speed at which p(t) moves when it obeys the replicator equation. Actually the square of the speed is simpler:

\begin{array}{ccl}  \displaystyle{ \left\| \frac{dp}{dt}  \right\|^2 } &=& \displaystyle{ \sum_i \frac{1}{p_i} \left( \frac{dp_i}{dt} \right)^2 } \\ \\  &=& \displaystyle{ \sum_i \frac{1}{p_i} \left( ( f_i - \langle f \rangle ) \, p_i \right)^2 } \\ \\  &=& \displaystyle{ \sum_i  ( f_i - \langle f \rangle )^2 p_i }   \end{array}

The answer has a nice meaning, too! It’s just the variance of the fitness: that is, the square of its standard deviation.

So, if you’re willing to buy my claim that the speed \|dp/dt\| is the rate at which our population learns new information, then we’ve seen that the square of the rate at which a population learns information is the variance of its fitness!

Fisher’s fundamental theorem

Now, how is this related to Fisher’s fundamental theorem of natural selection? First of all, what is Fisher’s fundamental theorem? Here’s what Wikipedia says about it:

It uses some mathematical notation but is not a theorem in the mathematical sense.

It states:

“The rate of increase in fitness of any organism at any time is equal to its genetic variance in fitness at that time.”

Or in more modern terminology:

“The rate of increase in the mean fitness of any organism at any time ascribable to natural selection acting through changes in gene frequencies is exactly equal to its genetic variance in fitness at that time”.

Largely as a result of Fisher’s feud with the American geneticist Sewall Wright about adaptive landscapes, the theorem was widely misunderstood to mean that the average fitness of a population would always increase, even though models showed this not to be the case. In 1972, George R. Price showed that Fisher’s theorem was indeed correct (and that Fisher’s proof was also correct, given a typo or two), but did not find it to be of great significance. The sophistication that Price pointed out, and that had made understanding difficult, is that the theorem gives a formula for part of the change in gene frequency, and not for all of it. This is a part that can be said to be due to natural selection.

Price’s paper is here:

• George R. Price, Fisher’s ‘fundamental theorem’ made clear, Annals of Human Genetics 36 (1972), 129–140.

I don’t find it very clear, perhaps because I didn’t spend enough time on it. But I think I get the idea.

My result is a theorem in the mathematical sense, though quite an easy one. I assume a population distribution evolves according to the replicator equation and derive an equation whose right-hand side matches that of Fisher’s original equation: the variance of the fitness.

But my left-hand side is different: it’s the square of the speed of the corresponding probability distribution, where speed is measured using the ‘Fisher information metric’. This metric was discovered by the same guy, Ronald Fisher, but I don’t think he used it in his work on the fundamental theorem!

Something a bit similar to my statement appears as Theorem 2 of this paper:

• Marc Harper, Information geometry and evolutionary game theory.

and for that theorem he cites:

• Josef Hofbauer and Karl Sigmund, Evolutionary Games and Population Dynamics, Cambridge University Press, Cambridge, 1998.

However, his Theorem 2 really concerns the rate of increase of fitness, like Fisher’s fundamental theorem. Moreover, he assumes that the probability distribution p(t) flows along the gradient of a function, and I’m not assuming that. Indeed, my version applies to situations where the probability distribution moves round and round in periodic orbits!

Relative information and the Fisher information metric

The key to generalizing Fisher’s fundamental theorem is thus to focus on the speed at which p(t) moves, rather than the increase in fitness. Why do I call this speed the ‘rate at which the population learns information’? It’s because we’re measuring this speed using the Fisher information metric, which is closely connected to relative information, also known as relative entropy or the Kullback–Leibler divergence.

I explained this back in Part 7, but that explanation seems hopelessly technical to me now, so here’s a faster one, which I created while preparing my talk.

The information of a probability distribution q relative to a probability distribution p is

\displaystyle{     I(q,p) = \sum_{i =1}^n q_i \log\left(\frac{q_i}{p_i}\right) }

It says how much information you learn if you start with a hypothesis p saying that the probability of the ith situation was p_i, and then update this to a new hypothesis q.
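
For a tiny concrete example: if the old hypothesis is p = (1/2, 1/2) and the new one is q = (9/10, 1/10), then (using natural logarithms)

\displaystyle{ I(q,p) = \frac{9}{10} \log\left(\frac{9/10}{1/2}\right) + \frac{1}{10} \log\left(\frac{1/10}{1/2}\right) \approx 0.37 }

so updating from the even-odds hypothesis to the lopsided one teaches you about 0.37 nats of information.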

Now suppose you have a hypothesis that’s changing with time in a smooth way, given by a time-dependent probability p(t). Then a calculation shows that

\displaystyle{ \left.\frac{d}{dt} I(p(t),p(t_0)) \right|_{t = t_0} = 0 }

for all times t_0. This seems paradoxical at first. I like to jokingly put it this way:

To first order, you’re never learning anything.

However, as long as the velocity \frac{d}{dt}p(t_0) is nonzero, we have

\displaystyle{ \left.\frac{d^2}{dt^2} I(p(t),p(t_0)) \right|_{t = t_0} > 0 }

so we can say

To second order, you’re always learning something… unless your opinions are fixed.

This lets us define a ‘rate of learning’—that is, a ‘speed’ at which the probability distribution p(t) moves. And this is precisely the speed given by the Fisher information metric!

In other words:

\displaystyle{ \left\|\frac{dp}{dt}(t_0)\right\|^2 =  \left.\frac{d^2}{dt^2} I(p(t),p(t_0)) \right|_{t = t_0} }

where the length is given by the Fisher information metric. Indeed, this formula can be used to define the Fisher information metric. From this definition we can easily work out the concrete formula I gave earlier.
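
Here is the quick computation behind these claims, using the fact that \sum_i dp_i/dt = 0 (the probabilities always sum to 1). Differentiating

\displaystyle{ I(p(t),p(t_0)) = \sum_i p_i(t) \left( \log p_i(t) - \log p_i(t_0) \right) }

once gives

\displaystyle{ \frac{d}{dt} I(p(t),p(t_0)) = \sum_i \frac{dp_i}{dt} \left( \log p_i(t) - \log p_i(t_0) \right) }

which vanishes at t = t_0, since the logarithms cancel there. Differentiating once more and setting t = t_0, the term involving second derivatives of the p_i is multiplied by those same vanishing logarithms, and what survives is

\displaystyle{ \left.\frac{d^2}{dt^2} I(p(t),p(t_0)) \right|_{t = t_0} = \sum_i \frac{1}{p_i(t_0)} \left( \frac{dp_i}{dt}(t_0) \right)^2 }

which is exactly the squared Fisher norm of dp/dt.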

In summary: as a probability distribution moves around, the relative information between the new probability distribution and the original one grows approximately as the square of time, not linearly. So, to talk about a ‘rate at which information is learned’, we need to use the above formula, involving a second time derivative. This rate is just the speed at which the probability distribution moves, measured using the Fisher information metric. And when we have a probability distribution describing how many replicators are of different species, and it’s evolving according to the replicator equation, this speed is also just the variance of the fitness!


04 Feb 02:04

Dermatologist-level classification of skin cancer with deep neural networks

by Andre Esteva

Dermatologist-level classification of skin cancer with deep neural networks

Nature 542, 7639 (2017). doi:10.1038/nature21056

Authors: Andre Esteva, Brett Kuprel, Roberto A. Novoa, Justin Ko, Susan M. Swetter, Helen M. Blau & Sebastian Thrun

Skin cancer, the most common human malignancy, is primarily diagnosed visually, beginning with an initial clinical screening and followed potentially by dermoscopic analysis, a biopsy and histopathological examination. Automated classification of skin lesions using images is a challenging task owing to the fine-grained variability in the appearance of skin lesions. Deep convolutional neural networks (CNNs) show potential for general and highly variable tasks across many fine-grained object categories. Here we demonstrate classification of skin lesions using a single CNN, trained end-to-end from images directly, using only pixels and disease labels as inputs. We train a CNN using a dataset of 129,450 clinical images—two orders of magnitude larger than previous datasets—consisting of 2,032 different diseases. We test its performance against 21 board-certified dermatologists on biopsy-proven clinical images with two critical binary classification use cases: keratinocyte carcinomas versus benign seborrheic keratoses; and malignant melanomas versus benign nevi. The first case represents the identification of the most common cancers, the second represents the identification of the deadliest skin cancer. The CNN achieves performance on par with all tested experts across both tasks, demonstrating an artificial intelligence capable of classifying skin cancer with a level of competence comparable to dermatologists. Outfitted with deep neural networks, mobile devices can potentially extend the reach of dermatologists outside of the clinic. It is projected that 6.3 billion smartphone subscriptions will exist by the year 2021 (ref. 13) and can therefore potentially provide low-cost universal access to vital diagnostic care.

04 Feb 00:34

Aleppo

by Minnesotastan

Evocative.  Res ipsa loquitur.

Photo credit Mohammed Al-Khatieb/AFP/Getty Images, via The Huffington Post.

03 Feb 23:02

QCD-Aware Recursive Neural Networks for Jet Physics. (arXiv:1702.00748v2 [hep-ph] UPDATED)

by Gilles Louppe, Kyunghyun Cho, Cyril Becot, Kyle Cranmer

Recent progress in applying machine learning for jet physics has been built upon an analogy between calorimeters and images. In this work, we present a novel class of recursive neural networks built instead upon an analogy between QCD and natural languages. In the analogy, four-momenta are like words and the clustering history of sequential recombination jet algorithms is like the parsing of a sentence. Our approach works directly with the four-momenta of a variable-length set of particles, and the jet-based tree structure varies on an event-by-event basis. Our experiments highlight the flexibility of our method for building task-specific jet embeddings and show that recursive architectures are significantly more accurate and data efficient than previous image-based networks. We extend the analogy from individual jets (sentences) to full events (paragraphs), and show for the first time an event-level classifier operating on all the stable particles produced in an LHC event.

31 Jan 20:04

Biology as Information Dynamics

by John Baez

This is my talk for the workshop Biological Complexity: Can It Be Quantified?

• John Baez, Biology as information dynamics, 2 February 2017.

Abstract. If biology is the study of self-replicating entities, and we want to understand the role of information, it makes sense to see how information theory is connected to the ‘replicator equation’—a simple model of population dynamics for self-replicating entities. The relevant concept of information turns out to be the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Using this we can get a new outlook on free energy, see evolution as a learning process, and give a clean general formulation of Fisher’s fundamental theorem of natural selection.

For more, read:

• Marc Harper, The replicator equation as an inference dynamic.

• Marc Harper, Information geometry and evolutionary game theory.

• Barry Sinervo and Curt M. Lively, The rock-paper-scissors game and the evolution of alternative male strategies, Nature 380 (1996), 240–243.

• John Baez, Diversity, entropy and thermodynamics.

• John Baez, Information geometry.

The last reference contains proofs of the equations shown in red in my slides.
In particular, Part 16 contains a proof of my updated version of Fisher’s fundamental theorem.


27 Jan 21:29

Pregnancy leads to long-lasting changes in human brain structure

by Elseline Hoekzema
Nosimpler

Wouldn't it?

Nature Neuroscience 20, 287 (2017). doi:10.1038/nn.4458

Authors: Elseline Hoekzema, Erika Barba-Müller, Cristina Pozzobon, Marisol Picado, Florencio Lucco, David García-García, Juan Carlos Soliva, Adolf Tobeña, Manuel Desco, Eveline A Crone, Agustín Ballesteros, Susanna Carmona & Oscar Vilarroya

25 Jan 16:30

Papers Written While Drunk

by leinster

I’m currently reading a preprint by a deservedly very well-respected and highly-reputed mathematician. It’s enjoyable, inspirational, and wonderful. The ideas that it expresses have been haunting and taunting me for years.

For various reasons, I have the impression that it was not wholly written while the author was wholly sober. That’s OK; I’ll judge the paper for what it is, not on how it was written. But it leads me to wonder: how common is this? In literature, it’s a well-established tradition to the point of cliché. For instance, here’s Ernest Hemingway —

Hemingway in Cuba

— giving a cocktail recipe for difficult political times (1937), “to be enjoyed from 11:00am on”. You can find countless examples of fiction writers enthusing about chemically-assisted escape from the so-called real world.

But mathematics prides itself on sharpness and precision in counterpoint to creativity. We love to say that we’re more creative than poets, but a piece of mathematics is in deep trouble if it’s logically wrong. So where does drugged, drunk or hallucinatory mathematics fit into our mathematicians’ culture?

25 Jan 00:15

Lavabit, Snowden’s Favorite Encrypted Email Service, Returns from the Dead

by Scott Shackford

Email service provider Lavabit famously (in tech security circles anyway) shut its doors and turned itself off back in 2013. Its owner, Ladar Levison, explained that he was doing so to keep from having to comply with federal government orders to hand over the encryption key that would give the feds access to the contents of emails by domestic surveillance whistleblower Edward Snowden.

Now, as a new administration takes control of the White House, Levison and Lavabit are returning. Lavabit is relaunching its services, now that Levison has worked to make it even harder for the federal government to gain access to emails sent by its users. In his announcement, timed to coincide with Donald Trump's inauguration, Levison explained that he had developed an end-to-end encryption system that, once fully implemented, would minimize the ability of outsiders to access users' info.

Kim Zetter over at The Intercept has more details directly from Lavabit:

With the new architecture, Lavabit will no longer be able to hand over its SSL key, because the key is now stored in a hardware security module — a tamper-resistant device that provides a secure enclave for storing keys and performing sensitive functions, like encryption and decryption. Lavabit generates a long passphrase blindly so the company doesn't know what it is; Lavabit then inserts the key into the device and destroys the passphrase.

"Once it's in there we cannot pull that SSL key back out," says Sean, a Lavabit developer who asked to be identified only by his first name. (Many of Lavabit's coders and engineers are volunteers who work for employers who might not like them helping build a system that thwarts government surveillance.)

If anyone does try to extract the key, it will trigger a mechanism that causes the key to self-destruct.

The hardware security module is a temporary solution, however, until end-to-end encryption is available, which will encrypt email on the user's device and make the SSL encryption less critical.

The site for Lavabit is active, and for those who want to subscribe, the price currently ranges from $15 to $30 annually, depending on storage limits. And they accept bitcoins!

Reason TV has previously interviewed Levison about the importance of encryption in protecting liberty and privacy (and warnings about those who simply use vague encryption and security claims for marketing purposes). Watch below:

23 Jan 19:59

Louisiana Police Chief: Resisting Arrest is Now a Hate Crime Under State Law

by C.J. Ciaramella

A Louisiana police chief says the state's new "blue lives matter" law, which makes it a hate crime to target a police officer, extends to simply resisting arrest.

The law was enacted last year as part of a surge of similar legislation introduced around the country following several high-profile ambushes and deadly attacks against police officers, including a Baton Rouge shooting that left three police dead. While many states enhanced the penalties for assaulting police officers, Louisiana became the first state in the U.S. to make police a protected class under hate-crime laws when the governor signed the legislation into law in May.

A New Orleans man was the first person to be charged under the new law last September for allegedly shouting racial and sexist slurs at police. But now, at least one local police chief thinks those protections extend even further.

Louisiana's KATC reports:

St. Martinville Police Chief Calder Hebert hopes the law will not only save lives, but make offenders think twice before resisting arrest.

"We don't need the general public being murdered for no reason and we don't need officers being murdered for no reason. We all need to just work together," said Hebert.

Hebert is very familiar with the new hate crime law, having already enforced it since it took effect in August.

"Resisting an officer or battery of a police officer was just that charge, simply. But now, Governor Edwards, in the legislation, made it a hate crime now," said Hebert.

Under the new law, Hebert says any offender who resists, or gets physical, with an officer can be charged with a felony hate crime.

Those convicted of felony hate crime in Louisiana face a fine of up to $5,000 or a five-year prison sentence, while a hate-crime charge tacked onto a misdemeanor is punishable by a $500 fine or six months in jail.

It's notoriously easy to be charged with resisting arrest, so much so that police departments across the country often consider a large number of resisting arrest charges as a potential red flag for officer misconduct. For example, a WNYC investigation found that just 5 percent of NYPD officers accounted for 40 percent of the 51,503 resisting arrest charges filed between 2009 and 2014. Several of those officers had a history of excessive force complaints and civil rights lawsuits being filed against them.

Of course, there is the question of how a prosecutor could prove that a person resisting arrest was doing so specifically because he or she hated the police. It seems doubtful that widespread application of Hebert's, shall we say, novel legal theory would survive any sort of scrutiny. But then, that would seem to be an underlying problem with the whole notion of extending hate-crime protections to a profession. That's one of the reasons the Anti-Defamation League and other groups that generally support hate crime laws opposed the bill when it was introduced.

Here's what the ADL said when the bill was sailing through the Louisiana legislature:

ADL strongly believes that the list of personal characteristics included in hate crimes laws should remain limited to immutable characteristics, those qualities that can or should not be changed. Working in a profession is not a personal characteristic, and it is not immutable. As a society, we make great efforts to help protect law enforcement and ensure they receive justice. Additionally, ADL is concerned that expanding the characteristics included in bias crime laws may open the door to a myriad of other categories to be added and simultaneously dilute current hate crimes legislation. This bill confuses the purpose of the Hate Crimes Act and weakens its impact by adding more categories of people, who are better protected under other laws.

Or, for a slightly more critical take on hate crime enhancements in general, here's my colleague Scott Shackford, writing about the New Orleans man charged with hate crimes for shouting slurs at the police:

As somebody who has read many, many, many reports of anti-gay assaults and violence over the years, I just want to point out that while it probably looks clear to everybody outside the police that this wasn't a hate crime (again, regardless of a position on hate crime laws), what do people consider when evaluating the credibility of hate crime claims against other minorities? Things like whether the person assaulting a gay person or other minority shouted bigoted slurs, just like Delatoba did here. That is one of the factors used to decide that a crime is motivated by hate, and many supporters of hate crime laws get very, very upset when police don't immediately accept that hate speech as evidence that a hate crime occurred. But since we don't have the ability to read minds, what hate crime enhancements often actually do is add additional punishment based on what people say or express while committing a crime.

23 Jan 17:33

Coherence of Biochemical Oscillations is Bounded by Driving Force and Network Topology. (arXiv:1701.05848v2 [cond-mat.stat-mech] UPDATED)

by Andre C Barato, Udo Seifert

Biochemical oscillations are prevalent in living organisms. Systems with a small number of constituents cannot sustain coherent oscillations for an indefinite time because of fluctuations in the period of oscillation. We show that the number of coherent oscillations that quantifies the precision of the oscillator is universally bounded by the thermodynamic force that drives the system out of equilibrium and by the topology of the underlying biochemical network of states. Our results are valid for arbitrary Markov processes, which are commonly used to model biochemical reactions. We apply our results to a model for a single KaiC protein and to an activator-inhibitor model that consists of several molecules. From a mathematical perspective, based on strong numerical evidence, we conjecture a universal constraint relating the imaginary and real parts of the first non-trivial eigenvalue of a stochastic matrix.

22 Jan 15:57

Biochemical Machines for the Interconversion of Mutual Information and Work

by Thomas McGrath, Nick S. Jones, Pieter Rein ten Wolde, and Thomas E. Ouldridge

Author(s): Thomas McGrath, Nick S. Jones, Pieter Rein ten Wolde, and Thomas E. Ouldridge

An enzyme in a chemical bath can act as an autonomous biochemical device that exploits information to do work.


[Phys. Rev. Lett. 118, 028101] Published Tue Jan 10, 2017

20 Jan 17:12

Solving Nonlinearly Separable Classifications in a Single-Layer Neural Network

by Nolan Conaway
Neural Computation, Volume 29, Issue 3, Page 861-866, March 2017.

15 Jan 02:08

Solar Irradiance Measurements

by John Baez

guest post by Nadja Kutz

This blog post is based on a thread in the Azimuth Forum.

The current theories about the Sun’s life-time indicate that the Sun will turn into a red giant in about 5 billion years. How and when this process is going to be destructive to the Earth is still debated. Apparently, according to more or less current theories, there has been a quasilinear increase in luminosity. On page 3 of

• K.-P. Schröder and Robert Connon Smith, Distant future of the Sun and Earth revisited, 2008.

we read:

The present Sun is increasing its average luminosity at a rate of 1% in every 110 million years, or 10% over the next billion years.

Unfortunately I feel a bit doubtful about this, in particular after I looked at some irradiation measurements. But let’s recap a bit.

In the Azimuth Forum I asked for information about solar irradiance measurements. Why I was originally interested in how bright the Sun is shining is a longer story, which includes discussions about the global warming potential of methane. For this post I prefer to omit this lengthy historical survey about my original motivations (maybe I’ll come back to this later). Meanwhile there is also a newer reason why I am interested in solar irradiance measurements, which I want to talk about here.

Strictly speaking I was not only interested in knowing more about how bright the sun is shining, but how bright each of its ‘components’ is shining. That is, I wanted to see spectrally resolved solar irradiance measurements—and in particular, measurements in the range between the wavelengths of roughly 650 and 950 nanometers.

This led me to the Sorce mission, a NASA-sponsored satellite mission whose website is located at the University of Colorado. The website very nicely provides an interactive interface, including a fairly clear and intuitive LISIRD interactive app with which the spectral measurements of the Sun can be studied.

As a side remark I should mention that this mission belongs to NASA's Earth Science program, which is currently threatened with being scrapped.

By using this app, I found in the 650–950 nanometer range a very strange rise in radiation between 2003 and 2016, which happened mainly in the last 2-3 years. You can see this rise here (click to enlarge):

Spectral line 774.5 nm from day 132 to day 5073 (day 132 is 24 January 2003; day 5073 is the end of 2016)

Now, fluctuations within certain spectral ranges within the Sun’s spectrum are not news. Here, however, it looked as if a rather stable range suddenly started to change rather “dramatically”.

I put the word “dramatically” in quotes for a couple of reasons.

Spectral measurements are complicated and prone to measurement errors. Subtle issues of dirty lenses and the like are already enough to suggest that this is no easy feat, so this strange rise might easily be due to a measurement failure. Moreover, as I said, it looked as if this was a fairly stable range over the course of ten years. But maybe this new rise in irradiation is part of the 11-year solar cycle, i.e., a common phenomenon. In addition, although the rise looks big, it may overall still be rather subtle.

So: how subtle or non-subtle is it then?

In order to assess that, I made a quick estimate (see the Forum discussion) and found that if all the additional radiation reached the ground (which of course it doesn't, due to absorption), then with 1000 square meters you could easily power a lawn mower with that subtle change! I.e., my estimate was 1200 watts for that patch of lawn. Whoa!

That was disconcerting enough to make me download the data, linearly interpolate it, and calculate the power of that change. I wrote a program in Javascript to do that. The computer calculations revealed an answer of 1000 watts, i.e., my estimate was fairly close. Whoa again!

How does this translate to overall changes in solar irradiance? Some increase had already been noticed. In 2003, NASA wrote on its webpage:

Although the inferred increase of solar irradiance in 24 years, about 0.1 percent, is not enough to cause notable climate change, the trend would be important if maintained for a century or more.

That was 13 years ago.

I now used my program to calculate the irradiance for one day in 2016 between the wavelengths of 180.5 nm and 1797.62 nm, quite a big part of the solar spectrum, and got the value 627 W/m2. I computed the difference between this and one day in 2003, approximately one solar cycle earlier. I got 0.61 W/m2, which is 0.1% in 13 years rather than 24 years. Of course this is not an average value, it is not really adjusted to the solar cycle, and fluctuations play a big role in some parts of the spectrum, but still, this might indicate that the overall rate of rise in solar radiation may have doubled. Likewise for the question of the Sun's luminosity: to assess luminosity one would need to take into account the concrete satellite-Earth orbit on the day of measurement, since the distance to the Sun varies. Still, at first glance this all appears disconcerting.
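
Schematically, the calculation is just an integration of the spectral irradiance over wavelength for each day, followed by a difference. Here is a rough sketch of that step in Python rather than the Javascript mentioned above (the file names and the two-column format are assumptions about how the LISIRD data might be exported):

import numpy as np

def integrated_irradiance(filename):
    # Assumed format: two columns per line, wavelength (nm) and
    # spectral irradiance (W/m^2/nm), e.g. as exported from LISIRD.
    wavelength, irradiance = np.loadtxt(filename, unpack=True)
    # Trapezoidal rule: linearly interpolate between measured wavelengths,
    # then integrate the interpolant exactly.
    return np.sum(0.5 * (irradiance[1:] + irradiance[:-1]) * np.diff(wavelength))

# Hypothetical exports of the spectra for the two days being compared.
total_2016 = integrated_irradiance("sorce_spectrum_2016.txt")
total_2003 = integrated_irradiance("sorce_spectrum_2003.txt")

print("2016 total over this range:", total_2016, "W/m^2")
print("difference 2016 - 2003:", total_2016 - total_2003, "W/m^2")

Summing trapezoids between the measured wavelengths is the same as linearly interpolating the spectrum and integrating it exactly.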

Given that this spectral range overlaps, for example, with the absorption bands of water (clouds!), this should at least be discussed.

See how the spectrum splits into a purple and dark red line in the lower circle? (Click to enlarge.)

Difference in spectrum between day 132 and day 5073

The upper circle displays another rise, which is discussed in the forum.

So, concluding: all this looks as if it needs to be monitored a bit more closely. It is important to see whether these rises in irradiance also show up in other measurements, so I asked in the Azimuth Forum, but so far I have gotten no answer.

The Russian Wikipedia page about solar irradiance unfortunately contains no links to Russian satellite missions (unless I have overlooked something), and there is no Chinese or Indian Wikipedia article about solar irradiance. I also couldn't find any publicly accessible spectral irradiance measurements on the ESA website (although they have some satellites out there). In December I wrote an email to Wolfgang Finsterle, the head of the solar radiometry section of the World Radiation Center (WRC), but I've had no answer yet.

In short: if you know about publicly available solar spectral irradiance measurements other than the LISIRD ones, then please let me know.


11 Jan 21:13

Emergent stochastic oscillations and signal detection in tree networks of excitable elements. (arXiv:1701.01693v1 [physics.bio-ph])

by Justus Kromer, Ali Khaledi-Nasab, Lutz Schimansky-Geier, Alexander B. Neiman

We study the stochastic dynamics of strongly-coupled excitable elements on a tree network. The peripheral nodes receive independent random inputs which may induce large spiking events propagating through the branches of the tree and leading to global coherent oscillations in the network. This scenario may be relevant to action potential generation in certain sensory neurons, which possess myelinated distal dendritic tree-like arbors with excitable nodes of Ranvier at peripheral and branching nodes and exhibit noisy periodic sequences of action potentials.

We focus on the spiking statistics of the central node, which fires in response to a noisy input at peripheral nodes. We show that, in the strong coupling regime, relevant to myelinated dendritic trees, the spike train statistics can be predicted from an isolated excitable element with rescaled parameters according to the network topology. Furthermore, we show that by varying the network topology the spike train statistics of the central node can be tuned to have a certain firing rate and variability, or to allow for an optimal discrimination of inputs applied at the peripheral nodes.

05 Jan 03:52

Explorability and the origin of Network Sparsity in Living Systems. (arXiv:1701.00542v2 [physics.soc-ph] UPDATED)

by Daniel M. Busiello, Samir Suweis, Jorge Hidalgo, Amos Maritan

The increasing volume of ecologically and biologically relevant data has revealed a wide collection of emergent patterns in living systems. Analyzing different datasets, ranging from metabolic gene-regulatory to species interaction networks, we find that these networks are sparse, i.e. the percentage of the active interactions scales inversely proportional to the system size. This puzzling characteristic has not yet been considered or explained. Herein, we introduce the new concept of explorability, a measure of the ability of the system to adapt to newly intervening changes. We show that sparsity is an emergent property resulting from a variational principle aiming at the optimization of both explorability and dynamical robustness, the capacity of the system to remain stable after perturbations of the underlying dynamics. Networks with higher connectivities lead to an incremental difficulty to find better values for both the explorability and dynamical robustness, associated with the fine-tuning of the newly added interactions. A relevant characteristic of our solution is its scale invariance, that is, it remains optimal when several communities are assembled together. Connectivity is also a key ingredient determining ecosystem stability and our proposed solution contributes to solving May's celebrated complexity-stability paradox.

04 Jan 02:52

Ownership of an artificial limb induced by electrical brain stimulation [Neuroscience]

by Kelly L. Collins, Arvid Guterstam, Jeneva Cronin, Jared D. Olson, H. Henrik Ehrsson, Jeffrey G. Ojemann
Replacing the function of a missing or paralyzed limb with a prosthetic device that acts and feels like one’s own limb is a major goal in applied neuroscience. Recent studies in nonhuman primates have shown that motor control and sensory feedback can be achieved by connecting sensors in a robotic...

02 Jan 04:22

Redundancy and synergy in dual decompositions of mutual information gain and information loss. (arXiv:1612.09522v1 [physics.data-an])

by Daniel Chicharro, Stefano Panzeri
Nosimpler

Uh oh. Theoretical neuroscientists are getting into lattices.

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of information gain lattices, which allows separating the information that a set of variables contains about another into components interpretable as the unique information of one variable, or redundant and synergy components. In this work we extend the framework of Williams and Beer (2010) focusing on the lattices that underpin the decomposition. We generalize the type of constructible lattices and examine the relations between the terms in different lattices, for example relating bivariate and trivariate decompositions. We point out that, in information gain lattices, redundancy components are invariant across decompositions, but unique and synergy components are decomposition-dependent. Exploiting the connection between different lattices we propose a procedure to construct, in the general multivariate case, information decompositions from measures of synergy or unique information. We introduce an alternative type of mutual information decompositions based on information loss lattices, with the role and invariance properties of redundancy and synergy components exchanged with respect to gain lattices. We study the correspondence between information gain and information loss lattices and we define dual decompositions that allow overcoming the intrinsic asymmetry between invariant and decomposition-dependent components, which hinders the consistent joint characterization of synergy and redundancy.

02 Jan 02:43

NSA Axes Math Grants

by leinster

Old news, but interesting: the US National Security Agency (NSA) announced some months ago that it was suspending funding to its Mathematical Sciences Program. The announcement begins by phrasing it as a temporary suspension—

…[we] will be unable to fund any new proposals during FY2017 (i.e. Oct. 1, 2016–Sept. 30, 2017)

—but by the end, sounds resigned to a more permanent fate:

We thank the mathematics community and especially the American Mathematical Society for its interest and support over the years.

We’ve discussed this grant programme before on this blog.

The NSA is said to be the largest employer of mathematicians in the world, and has been under political pressure for obvious reasons over the last few years, so it’s interesting that it cut this programme. Its British equivalent, GCHQ, is doing the opposite, expanding its mathematics grants aggressively. But still, GCHQ consistently refuses to engage in any kind of adult, evidence-based discussion with the mathematical community on what the effect of its actions on society might actually be.

28 Dec 19:20

Looks like something from another planet...

by Minnesotastan

Because it is:
"... Groups of dark brown streaks have been photographed by the Mars Reconnaissance Orbiter on melting pinkish sand dunes covered with light frost..."
26 Dec 02:55

[Editors' Choice] Tracking extracellular space in the brain

by Peter Stern
Author: Peter Stern

26 Dec 00:06

Creepy Tech That Will Turbocharge Fake News

by Cathy O'Neil, mathbabe

My buddy Josh Vekhter is visiting from his Ph.D. program in computer science and told me about a couple of incredibly creepy technological advances that will soon make our previous experience of fake news seem quaint.

First, there’s a way to edit someone’s speech:

Next, there’s a way to edit a video to insert whatever facial expression you want (I blame Pixar on this one):

Put those two technologies together and you’ve got Trump and Putin having an entirely fictitious but believable conversation on video.


21 Dec 17:47

Marble runs

by Minnesotastan

Reposted from 2016 to add this awesome marble run choreographed to music:


With a hat tip to Miss C at Neatorama.

Reposted from January to add this even better one:



There is an explanation of some of the mechanics at Neatorama.

18 Dec 01:19

Field Notes on the Behaviour of a Large Assemblage of Ecologists

by leinster

I’ve just come back from the annual conference of the British Ecological Society in Liverpool. For several years I’ve had a side-interest in ecology, but I’d never spent time with a really large group of ecologists before, and it taught me some things. Here goes:

  1. Size and scale. Michael Reed memorably observed that the American Mathematical Society is about the same size as the American Society for Nephrology, “and that’s just the kidney”. Simply put: not many people care about mathematics.

    The British Ecological Society (BES) meeting had 1200 participants, which is about ten times bigger than the annual international category theory meeting, and still only a fraction of the size of the conference run by the Ecological Society of America. You may reply that the US Joint Mathematics Meetings attract about 7000 participants; but as Reed pointed out (under the heading “Most of Science is Biology”), the Society for Neuroscience gets about 30,000. Even at the BES meeting in our small country, there were nearly 600 talks, 70 special sessions, and 220 posters. In the parallel sessions, you had a choice of 12 talks to go to at any given moment in time.

  2. Concision. Almost all talks were 12 minutes, with 3 minutes for questions. You cannot, of course, say much in that time.

    With so many people attending and wanting to speak, it’s understandable that the culture has evolved this way. And I have to say, it’s very nice that if you choose to attend a talk and swiftly discover that you chose badly, you’ve only lost 15 minutes.

    But there are many critiques of enforced brevity, including from some very distinguished academics. It’s traditionally held that the most prestigious journals in all of science are Nature and Science, and in both cases the standard length of an article is only about three pages. The style of such papers is ludicrously condensed, and from my outsider’s point of view I gather that there’s something of a backlash against Nature and Science, with less constipated publications gaining ground in people’s mental ranking systems. When science is condensed too much, it takes on the character of a sales pitch.

    This is part of a wider phenomenon of destructive competition for attention. For instance, almost all interviews on TV news programmes are under ten minutes, and most are under five, with much of that taken up by the interviewer talking. The very design favours sloganeering and excludes all points that are too novel or controversial to explain in a couple of sentences. (The link is to a video of Noam Chomsky, who makes this point very effectively.) Not all arguments can be expressed to a general audience in a few minutes, as every mathematician knows.

  3. The pleasure of introductions. Many ecologists study one particular natural system, and often the first few minutes of their talks are a delight. You learn something new and amazing about fungi or beavers or the weird relationships between beetles and ants. Did you know that orangutans spend 80% of the day resting in their nests? Or that if you give a young orangutan some branches, he or she will instinctively start to weave them together in a nest-like fashion, as an innate urge that exists whether or not they’ve been taught how to do it? I didn’t.

    Orangutan resting in nest

  4. Interdisciplinarity. I’ve written before about the amazing interdisciplinarity of biologists. It seems to be ingrained in the intellectual culture that you need people who know stuff you don’t know, obviously! And that culture just isn’t present within mathematics, at least not to anything like the same extent.

    For instance, this afternoon I went to a talk about the diversity of microbiomes. The speaker pointed out that for what she was doing, you needed expertise in biology, chemistry, and informatics. She was unusual in actually spelling it out and spending time talking about it. Most of the time, speakers moved seamlessly from ecology to statistics to computation (typically involving processing of large amounts of DNA sequence data), without making a big deal of it.

    But there’s a byproduct of interdisciplinarity that troubles my mathematical soul:

  5. The off-the-shelf culture. Some of the speakers bowled me over with their energy, vision, tenacity, and positive outlook. But no one’s superhuman, so it’s inevitable that if your work involves serious aspects of multiple disciplines, you’re probably not going to look into everything profoundly. Or more bluntly: if you need some technique from subject X and you know nothing about subject X, you’re probably just going to use whatever technique everybody else uses.

    The ultimate reason why I ended up at this conference is that I’m interested in the quantification of biological diversity. So, much of the time I chose to go to talks that had the word “diversity” in the title, just to see what measure of diversity was used by actual practising ecologists.

    It wasn’t very surprising that almost all the time, as far as I could tell, there was no apparent examination of what the measures actually measured. They simply used whatever measure was predominant in the field.

    Now, I need to temper that with the reminder that the talks are ultra-short, with no time for subtleties. But still, when I asked one speaker why he chose the measure that he chose, the answer was that it’s simply what everyone else uses. And I can’t really point a finger of blame. He wasn’t a mathematician, any more than I’m an ecologist.

  6. The lack of theory. If this conference was representative of ecology, the large majority of ecologists study some specific system. By “system” I mean something like European hedgerow ecology, or Andean fungal ecology, or the impact of heatwaves on certain types of seaweed.

    This is, let me be clear, not a bad thing. Orders of magnitude more people care about seaweed than n-categories. But still, I was surprised by the sheer niche-ness of general theory in the context of ecology as a whole. A group of us are working on a system of diversity measures that are general in a mathematician’s sense; they effortlessly take in such examples as human demography, tropical forestry, epidemiology, and resistance to antibiotics. This didn’t seem like that big a deal to me previously — it’s just the bog-standard generality of mathematics. But after this week, I can see that from many ecologists’ eyes, it may seem insanely general.

    Actually, the most big-picture talks I saw were very unmathematical. They were, in fact, about policy and the future of humanity. I’m not being flippant:

  7. Unabashed politics. Mathematics is about an idealized world of imagination. Ecology is about our one and only natural world — one that we happen to be altering at an absolutely unprecedented rate. Words like “Brexit” and “Trump” came up dozens of times in the conference talks, and not in a tittery jocular way. The real decisions of people with real political power will have real, irreversible effect in the real world.

    Once again, this brought home to me that mathematics is not like (the rest of) science.

    It’s not just that we don’t have labs or experiments or hypothesis testing (at least, not in the same way). It’s that we can do mathematics in complete isolation from the realities of the world that human beings have made.

    We don’t have to think about deforestation or international greenhouse gas treaties or even local fishery byelaws. We might worry about the applications of mathematics — parasitic investment banks or deadly weapons or governments surveilling and controlling their citizens — but we can actually do mathematics in lamb-like innocence.

    On the other hand, for large parts of ecology, the political reality is an integral consideration.

    I saw some excellent talks, especially from Georgina Mace and Hugh Possingham, on policy and influencing governments. Possingham was talking about saving Portugal-sized areas of Australia from industrial destruction. (His advice for scientists engaging with governments: “Turn up. Have purpose. Maintain autonomy.”) Mace spoke on what are quite possibly the biggest threats to the entire planet: climate change, floods and heatwaves, population growth, and fragmentation and loss of habitats.

    It’s inspiring to see senior scientists being unafraid to repeat basic truths to those in power, to gather the available evidence and make broad estimates with much less than 100% of the data that one might wish for, in order to push changes that will actually improve human and other animal lives.

16 Dec 01:44

Saving Climate Data (Part 1)

by John Baez

I try to stay out of politics on this website. This post is not mainly about politics. It’s a call to action. We’re trying to do something rather simple and clearly worthwhile. We’re trying to create backups of US government climate data.

The background is, of course, political. Many signs point to a dramatic change in US climate policy:

• Oliver Milman, Trump’s transition: sceptics guide every agency dealing with climate change, The Guardian, 12 December 2016.

So, scientists are now backing up large amounts of climate data, just in case the Trump administration tries to delete it after he takes office on January 20th:

• Brady Dennis, Scientists are frantically copying U.S. climate data, fearing it might vanish under Trump, Washington Post, 13 December 2016.

Of course saving the data publicly available on US government sites is not nearly as good as keeping climate programs fully funded! New data is coming in all the time from satellites and other sources. We need it—and we need the experts who understand it.

Also, it’s possible that the Trump administration won’t go so far as trying to delete big climate science databases. Still, I think it can’t be a bad thing to have backups. Or as my mother always said: better safe than sorry!

Quoting the Washington Post article:

Alarmed that decades of crucial climate measurements could vanish under a hostile Trump administration, scientists have begun a feverish attempt to copy reams of government data onto independent servers in hopes of safeguarding it from any political interference.

The efforts include a “guerrilla archiving” event in Toronto, where experts will copy irreplaceable public data, meetings at the University of Pennsylvania focused on how to download as much federal data as possible in the coming weeks, and a collaboration of scientists and database experts who are compiling an online site to harbor scientific information.

“Something that seemed a little paranoid to me before all of a sudden seems potentially realistic, or at least something you’d want to hedge against,” said Nick Santos, an environmental researcher at the University of California at Davis, who over the weekend began copying government climate data onto a nongovernment server, where it will remain available to the public. “Doing this can only be a good thing. Hopefully they leave everything in place. But if not, we’re planning for that.”

[…]

“What are the most important .gov climate assets?” Eric Holthaus, a meteorologist and self-proclaimed “climate hawk,” tweeted from his Arizona home Saturday evening. “Scientists: Do you have a US .gov climate database that you don’t want to see disappear?”

Within hours, responses flooded in from around the country. Scientists added links to dozens of government databases to a Google spreadsheet. Investors offered to help fund efforts to copy and safeguard key climate data. Lawyers offered pro bono legal help. Database experts offered to help organize mountains of data and to house it with free server space. In California, Santos began building an online repository to “make sure these data sets remain freely and broadly accessible.”

In Philadelphia, researchers at the University of Pennsylvania, along with members of groups such as Open Data Philly and the software company Azavea, have been meeting to figure out ways to harvest and store important data sets.

At the University of Toronto this weekend, researchers are holding what they call a “guerrilla archiving” event to catalogue key federal environmental data ahead of Trump’s inauguration. The event “is focused on preserving information and data from the Environmental Protection Agency, which has programs and data at high risk of being removed from online public access or even deleted,” the organizers said. “This includes climate change, water, air, toxics programs.”

The event is part of a broader effort to help San Francisco-based Internet Archive with its End of Term 2016 project, an effort by university, government and nonprofit officials to find and archive valuable pages on federal websites. The project has existed through several presidential transitions.

I hope that small “guerilla archiving” efforts will be dwarfed by more systematic work, because it’s crucial that databases be copied along with all relevant metadata—and some sort of cryptographic certificate of authenticity, if possible. However, getting lots of people involved is bound to be a good thing, politically speaking.

If you have good computer skills, good understanding of databases, or lots of storage space, please get involved. Efforts are being coordinated by Barbara Wiggin and others at the Data Refuge Project:

• PPEHLab (Penn Program in the Environmental Humanities), DataRefuge.

You can contact them at DataRefuge@ppehlab.org. Nick Santos is also involved, and if you want to get “more plugged into the project” you can contact him here. They are trying to build a climate database mirror website here:

Climate Mirror.

At the help form on this website you can nominate a dataset for rescue, claim a dataset to rescue, let them know about a data rescue event, or help in some other way (which you must specify).

PPEHLab and Penn Libraries are organizing a data rescue event this Thursday:

• PPEHLab, DataRefuge meeting, 14 December 2016.

At the American Geophysical Union meeting in San Francisco, where more than 20,000 earth and climate scientists gather from around the world, there was a public demonstration today starting at 1:30 PST:

Rally to stand up for science, 13 December 2016.

And the “guerilla archiving” hackathon in Toronto is this Saturday—see below. If you know people with good computer skills in Toronto, get them to check it out!

To follow progress, also read Eric Holthaus’s tweets and replies here:

• Eric Holthaus.

Guerrilla archiving in Toronto

Here are details on this:

Guerrilla Archiving Hackathon

Date: 10am-4pm, December 17, 2016

Location: Bissell Building, 4th Floor, 140 St. George St., University of Toronto

RSVP and up-to-date information: Guerilla archiving: saving environmental data from Trump.

Bring: laptops, power bars, and snacks. Coffee and pizza provided.

This event is a collaboration with the Internet Archive's End of Term 2016 project, which seeks to archive the federal online pages and data that are in danger of disappearing during the Trump administration. Our event is focused on preserving information and data from the Environmental Protection Agency, which has programs and data at high risk of being removed from online public access or even deleted. This includes the climate change, water, air, and toxics programs. This project is urgent because the Trump transition team has identified the EPA and other environmental programs as priorities for the chopping block.

The Internet Archive is a San Francisco-based nonprofit digital library that aims to preserve knowledge and make it universally accessible. Its End of Term web archive captures and saves U.S. government websites that are at risk of changing or disappearing altogether during government transitions. The Internet Archive has asked volunteers to help select and organize information that will be preserved before the Trump transition.

End of Term web archive: http://eotarchive.cdlib.org/2016.html

New York Times article: “Harvesting Government History, One Web Page at a Time”

Activities:

Identifying endangered programs and data

Seeding the End of Term webcrawler with priority URLs (see the sketch after this list)

Identifying and mapping the location of inaccessible environmental databases

Hacking scripts to make hard-to-reach databases accessible to the webcrawler

Building a toolkit so that other groups can hold similar events
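
As a small illustration of the “seeding” item above: one way an individual can request fresh Wayback Machine snapshots of a few priority pages is the Internet Archive's public “Save Page Now” endpoint. The sketch below assumes that workflow; it is not the End of Term project's own nomination or crawling pipeline, and the PRIORITY_URLS entries are hypothetical placeholders.

# A rough sketch: ask the Wayback Machine to capture each priority URL by
# fetching https://web.archive.org/save/<url>. PRIORITY_URLS is hypothetical.

from urllib.request import Request, urlopen

PRIORITY_URLS = [
    "https://www.epa.gov/climatechange",   # hypothetical example page
]

for url in PRIORITY_URLS:
    req = Request("https://web.archive.org/save/" + url,
                  headers={"User-Agent": "data-rescue-sketch"})
    with urlopen(req) as response:
        # The final URL of the response points at the archived snapshot.
        print(url, "->", response.geturl())

For whole databases, the webcrawler seeding and toolkit work described above is the better route; the point here is just that even snapshotting individual pages is within reach of anyone with basic scripting skills.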

Skills needed: We need all kinds of people — and that means you!

People who can locate relevant webpages for the Internet Archive’s webcrawler

People who can identify data targeted for deletion by the Trump transition team and the organizations they work with

People with knowledge of government websites and information, including the EPA

People with library and archive skills

People who are good at navigating databases

People interested in mapping where inaccessible data is located at the EPA

Hackers to figure out how to extract data and URLs from databases (in a way that the Internet Archive can use)

People with good organization and communication skills

People interested in creating a toolkit for reproducing similar events

Contacts: michelle.murphy@utoronto.ca, p.keilty@utoronto.ca


13 Dec 15:43

The Revolution Has Begun: Beyond Meat

by Alex Tabarrok

Animal rights will be the big social revolution of the 21st century. Most people have a vague feeling that factory farms aren't quite ethical. But few people are willing to give up meat, so such feelings are suppressed, because acknowledging them would only make one feel guilty rather than just. Once the costs of giving up meat fall, however, vegetarianism will spread like a prairie wildfire, changing eating habits, the use of farmland, and the science and economics of climate change.

Lab-grown or cultured meat is improving, but so is the science of veggie burgers. Beyond Meat has sold a very successful frozen “chicken” strip since 2013, and their non-frozen burger patties are just now seeing widespread distribution in the meat aisle at Whole Foods. Beyond Meat extracts protein from peas and then combines it with other vegetable elements under heating, cooling, and pressure to realign the proteins in a way that simulates the architecture of beef.

I picked up a two-pack on the weekend. Beyond Meat burgers look and cook like meat. But what about the taste?

[Photo: a cooked Beyond Meat burger]

The taste is excellent. The burger has a slightly smoky taste, not exactly like beef but like meat. If you had never tasted a buffalo burger before and I told you that this was a buffalo burger, you would have no reason to doubt me. A little sauce, salt, and pepper, and this is a very good-tasting burger, not a sacrifice for morality.

The price is currently higher than beef, $6 for two patties, but that's Whole Foods expensive, not out-of-reach expensive. I will buy more.

The revolution has begun.

[Photo: the BuzzFeed version of the burger]

The second picture is the BuzzFeed version. My burger wasn't quite so artfully arranged but was still delicious, and I attest to the overall accuracy.

Addendum: 20 g protein, 6 g carbs, 22 g fat (5 g saturated).


12 Dec 20:38

[In Depth] Energy pulses reveal possible new state of memory

by Jessica Boddy
Memory researchers have shone light into a cognitive limbo. A new memory—the name of someone you've just met, for example—is held for seconds in so-called working memory, as your brain's neurons continue to fire. If the person is important to you, the name will over a few days enter your long-term memory, preserved by permanently altered neural connections. But where does it go during the in-between hours, when it has left your standard working memory and is not yet embedded in long-term memory? To figure this out, a research team resurrects memories from this limbo. Their observations point to a new form of working memory, which they dub prioritized long-term memory, that exists without elevated neural activity. Consistent with other recent work, the study suggests that information can somehow be held among the synapses that connect neurons, even after conventional working memory has faded. This new memory state could have a range of practical implications, from helping college students learn more efficiently to assisting people with memory-related neurological conditions such as amnesia, epilepsy, and schizophrenia.

12 Dec 19:53

It Was I

It me, your father.

12 Dec 17:04

Nasal Respiration Entrains Human Limbic Oscillations and Modulates Cognitive Function

by Zelano, C., Jiang, H., Zhou, G., Arora, N., Schuele, S., Rosenow, J., Gottfried, J. A.

The need to breathe links the mammalian olfactory system inextricably to the respiratory rhythms that draw air through the nose. In rodents and other small animals, slow oscillations of local field potential activity are driven at the rate of breathing (~2–12 Hz) in olfactory bulb and cortex, and faster oscillatory bursts are coupled to specific phases of the respiratory cycle. These dynamic rhythms are thought to regulate cortical excitability and coordinate network interactions, helping to shape olfactory coding, memory, and behavior. However, while respiratory oscillations are a ubiquitous hallmark of olfactory system function in animals, direct evidence for such patterns is lacking in humans. In this study, we acquired intracranial EEG data from rare patients (Ps) with medically refractory epilepsy, enabling us to test the hypothesis that cortical oscillatory activity would be entrained to the human respiratory cycle, albeit at the much slower rhythm of ~0.16–0.33 Hz. Our results reveal that natural breathing synchronizes electrical activity in human piriform (olfactory) cortex, as well as in limbic-related brain areas, including amygdala and hippocampus. Notably, oscillatory power peaked during inspiration and dissipated when breathing was diverted from nose to mouth. Parallel behavioral experiments showed that breathing phase enhances fear discrimination and memory retrieval. Our findings provide a unique framework for understanding the pivotal role of nasal breathing in coordinating neuronal oscillations to support stimulus processing and behavior.

SIGNIFICANCE STATEMENT Animal studies have long shown that olfactory oscillatory activity emerges in line with the natural rhythm of breathing, even in the absence of an odor stimulus. Whether the breathing cycle induces cortical oscillations in the human brain is poorly understood. In this study, we collected intracranial EEG data from rare patients with medically intractable epilepsy, and found evidence for respiratory entrainment of local field potential activity in human piriform cortex, amygdala, and hippocampus. These effects diminished when breathing was diverted to the mouth, highlighting the importance of nasal airflow for generating respiratory oscillations. Finally, behavioral data in healthy subjects suggest that breathing phase systematically influences cognitive tasks related to amygdala and hippocampal functions.

03 Dec 17:03

Protocols

by noreply@blogger.com (Atrios)
While any particular move could be a boneheaded one which could set us on the path to Armageddon, I actually think the general concept of Trump blowing up our diplomatic protocols is fine. No, I am not here to praise Trump. Likely he will set us on the path to Armageddon. Still, I've long seen the way various long-established "rules" of doing things seem to serve little purpose other than to make sure that the deep state maintains its control of the executive.

Some of our diplomatic niceties are stupid, though I do understand they exist for good reasons, even if they exist for bad ones, as I suspect, also, too.