
Tag search results for: "research design"
Admin

Quantum networks promise to provide billions of people with secure and fast connections. Researchers, however, have yet to incorporate nodes of different types into such networks to make that bright quantum future a reality.


And here we have a breakthrough in building a hybrid quantum system. Researchers managed to transfer a qubit, a unit of quantum information, from one node based on a cold atomic gas to another based on a rare-earth-doped crystal. This is a sure sign of progress, so let's see how the work can be advanced further towards consumer-oriented devices.


In the study, the researchers report that after generating the photon that carries the qubit, they changed its wavelength so that it could travel through the conventional optical fiber linking the two nodes. Upon reception, the wavelength was changed a second time to match that of the receiving crystal itself.
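
To picture what this two-step wavelength conversion involves, here is a minimal Python sketch of the frequency arithmetic; the specific wavelengths (an atomic emission near 780 nm, a telecom-band fiber wavelength near 1550 nm, and a crystal absorption line near 606 nm) are assumptions for illustration, not values quoted from the paper.

```python
# Illustrative sketch of a two-step wavelength conversion chain.
# The wavelength values below are assumptions for illustration only.

C = 299_792_458  # speed of light, m/s

def freq_thz(wavelength_nm: float) -> float:
    """Optical frequency in THz for a given vacuum wavelength in nm."""
    return C / (wavelength_nm * 1e-9) / 1e12

# Node A (cold atomic gas) emits the photon carrying the qubit.
lambda_atom_nm = 780.0      # assumed atomic emission wavelength
# Step 1: shift into the telecom band so the photon survives standard fiber.
lambda_fiber_nm = 1550.0    # assumed telecom-band wavelength
# Step 2: shift again to match the receiving crystal's absorption line.
lambda_crystal_nm = 606.0   # assumed rare-earth crystal absorption wavelength

for label, lam in [("atomic node", lambda_atom_nm),
                   ("fiber link", lambda_fiber_nm),
                   ("crystal node", lambda_crystal_nm)]:
    print(f"{label:12s}: {lam:7.1f} nm  ({freq_thz(lam):6.1f} THz)")
```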


Finally, the team reports that the qubit was successfully encoded, stored and retrieved. In the authors' own words:


"We demonstrate that quantum correlations between a photon and a single collective spin excitation in the cold atomic ensemble can be transferred to the solid-state system.


"We also show that single-photon time-bin qubits generated in the cold atomic ensemble can be converted, stored and retrieved from the crystal with a conditional qubit fidelity of more than 85 per cent.


"Our results open up the prospect of optically connecting quantum nodes with different capabilities and represent an important step towards the realization of large-scale hybrid quantum networks.

"Photonic quantum state transfer between a cold atomic gas and a crystal" read paper


What is the next step? Since we at the ReptileCoin community intend to provide our users with advanced quantum networks, we need more research in this field; in particular, we aim to find ways of reducing costs for the mass user. The work to be done therefore includes surveying and comparing the different types of quantum nodes available. The research we need, and are going to support, should focus on evaluating them as components of the network to come: being part of a larger network may well shift the balance when weighing the pros and cons of the different physical platforms that can serve as nodes.


Admin
How can we have our pets speak? Let's look at advances in neuroscience that could quite plausibly put us in a position to develop speech in any creature that has a brain.

What happens in the brain when we hear something? Researchers took a patient who plays the piano well and recorded his brain activity while he played.

Then they turned the sound off and the same was done once again: this time the music existed only in the brain. The striking results may suggest a way of developing a certain area of the brain so that it can compose music. A human, or any other creature with a brain, could then play music using mental power alone: the music could be scanned or read from the brain directly and played out loud, as sound waves, to be heard by others.

Another way of advancing this result is to try to develop the areas of the brain that reflect the ideas we usually express through words, that is, through speech, and then to process the sound generated in the brain in a similar way: convert or translate that activity into sound that can be heard.

It is worth working in this direction to build a device capable of reading brain activity in a creature and translating it straight away into speech that others can understand.

In fact, we want this kind of reader-translator because we want to please everyone, whatever pet they keep: dogs, cats, parrots, snakes, lizards, tortoises, even fish!

The goal is to make pet owners even happier by giving their pets the ability to sing songs, speak and hold conversations.

Let's get back to the study, which, however, was done on humans.

First, the participant played two piano pieces on an electronic piano with the sound of the digital keyboard turned on. Second, the participant replayed the same pieces without auditory feedback and was asked to imagine hearing the music in his mind. The result: similarities between perception and imagery were found.

The experiment itself was a pretty simple one. What interests us are ways to advance it.

The first thing we need to do is determine what makes the two conditions different, that is, the different mechanisms in the brain involved in a) generating music that is imagined and unheard, and b) actually hearing the sound in physical form. We are going to support this research and get many different teams involved in it.
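
As a very rough illustration of how such a comparison could be set up, the sketch below fits a simple linear encoding model (sound spectrogram features predicting a neural signal) separately for a "heard" and an "imagined" condition and compares prediction accuracy per recording channel. The synthetic data, the ridge-regression choice and all variable names are assumptions for illustration; this is not the analysis pipeline of the paper.

```python
# Rough sketch: compare a linear encoding model across perception vs. imagery.
# Synthetic data stand in for real recordings; the approach is illustrative only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_freq_bins, n_channels = 2000, 32, 8

def fit_and_score(spectrogram, neural):
    """Fit spectrogram -> neural-channel regressions; return r per channel."""
    X_tr, X_te, y_tr, y_te = train_test_split(spectrogram, neural,
                                              test_size=0.25, random_state=0)
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    return np.array([np.corrcoef(pred[:, c], y_te[:, c])[0, 1]
                     for c in range(neural.shape[1])])

# Stand-in data: the same stimulus features, but different synthetic responses
# for the "heard" and "imagined" conditions.
spec = rng.normal(size=(n_samples, n_freq_bins))
W = rng.normal(size=(n_freq_bins, n_channels))
neural_heard = spec @ W + 0.5 * rng.normal(size=(n_samples, n_channels))
neural_imagined = spec @ (0.6 * W) + 1.0 * rng.normal(size=(n_samples, n_channels))

r_heard = fit_and_score(spec, neural_heard)
r_imagined = fit_and_score(spec, neural_imagined)
print("per-channel r, heard    :", np.round(r_heard, 2))
print("per-channel r, imagined :", np.round(r_imagined, 2))
print("channels where the two conditions differ most:",
      np.argsort(np.abs(r_heard - r_imagined))[::-1][:3])
```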

The original paper is called "Neural Encoding of Auditory Features during Music Perception and Imagery: Insight into the Brain of a Piano Player". In the authors' own terms, the research task is laid out this way:

"It remains unclear how the human cortex represents spectrotemporal sound features during auditory imagery, and how this representation compares to auditory perception.

This is the first turning point for further research.

The second is to replace the recording of electrocorticographic signals from electrodes implanted in the brain with a more suitable non-invasive technique, in order to move towards scanning-and-translation devices for the mass user.

And the third turning point in the research work to be done: we first focus on music, on auditory information represented by sounds of higher and lower frequency. In more advanced research we are going to work on linguistic information, on its representation in the brain and on ways to scan and translate it so that it can be recreated and heard.
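
As a toy illustration of the "scan and translate" idea, the sketch below learns a linear map from synthetic neural features back to spectrogram bins; a real pipeline would then invert that spectrogram into an audible waveform. Every detail here (data shapes, the linear decoder, the synthetic data) is an assumption for illustration, not a description of any existing study or device.

```python
# Toy sketch of decoding: map neural features back to a spectrogram.
# All data are synthetic and the linear decoder is an illustrative assumption.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_samples, n_channels, n_freq_bins = 3000, 16, 40

# Synthetic "true" spectrogram and neural features that linearly encode it.
spectrogram = np.abs(rng.normal(size=(n_samples, n_freq_bins)))
mixing = rng.normal(size=(n_freq_bins, n_channels))
neural = spectrogram @ mixing + 0.3 * rng.normal(size=(n_samples, n_channels))

# Fit the decoder on the first part, reconstruct on the held-out rest.
split = 2400
decoder = Ridge(alpha=1.0).fit(neural[:split], spectrogram[:split])
reconstructed = decoder.predict(neural[split:])

# Per-frequency-bin correlation between true and reconstructed spectrograms.
r = [np.corrcoef(reconstructed[:, f], spectrogram[split:, f])[0, 1]
     for f in range(n_freq_bins)]
print(f"mean reconstruction r over {n_freq_bins} bins: {np.mean(r):.2f}")
# A real pipeline would then invert the reconstructed spectrogram to a waveform
# (e.g. with a phase-reconstruction method) so the result can actually be heard.
```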