Rosetta Mission Resolution

Today is the day that the Rosetta spacecraft will end the mission it has been carrying out for the past two years. It's strange to imagine that it has been that long, as it seems only a couple of months ago that I was in a lecture detailing the engineering of the equipment aboard the vessel. The Rosetta mission was record-breaking in many ways: it was the first spacecraft to follow and orbit a comet, and the Philae lander was the first human-made object to make a controlled landing on a comet. This allowed the first images taken from a comet's surface and the first on-the-ground chemical test data to be collected. That isn't even going into the specifics of the first ultraviolet photography of a comet; the first craft near Jupiter's orbit powered primarily by solar energy; and the first recorded data of a comet's ice melting on approach to the Sun. With the mission over, it will be Rosetta's fate to crash, albeit slowly, into the comet's surface. Of course scientists, never missing an opportunity, have fitted it with even more sensitive cameras than the Philae lander in order to send some final pictures back to Earth before its demise. In a way it will be quite sad to think that it is no longer corkscrewing its path through space, but it is likely to have a permanent resting place on the comet until eventually colliding with the Sun in a couple of million years.

Stable Single Photon Sources

A single photon source is, unsurprisingly, a light source that gives out its light in the form of a single photon at any given time. Being able to produce an isolated single photon on demand would be very useful, and the effect has been demonstrated, but there are currently no practical single photon emitters that could be used for research purposes. The most common single emitters, nitrogen vacancy centres (a defect in diamond) and quantum dots (nanoscale semiconductor crystals), are not very reliable, and the chance of any one emission actually being a single photon is low. Outside of a laboratory these producers become even more unreliable, and so their use in any practical application is very limited. The best that has been produced so far are called heralded single photon sources. This is because they are in fact dual photon sources: once the two photons have been generated, by detecting one we can learn about the other, and in this way the first photon heralds the existence of its twin. If a perfect, reliable and functional producer of single photons could be found, it would need to be integrated into and scaled to fit on various computer chips. Quantum information and various branches of cryptography would vastly benefit from these unique light emitters, and with recent breakthroughs in nanofabrication and other production techniques it could become a reality in a couple of years' time.
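To make the heralding idea concrete, here is a minimal Monte Carlo sketch of how detecting one photon of a pair tells you its twin almost certainly exists. The pair-generation probability, detector efficiency and dark-count rate below are made-up illustrative numbers, not values from any real device:

```python
import random

# Toy Monte Carlo of a heralded single photon source.
# All numbers are illustrative assumptions, not measured values.
PAIR_PROB = 0.05     # chance a pump pulse creates a photon pair
HERALD_EFF = 0.60    # chance the herald detector catches its photon
DARK_PROB = 0.001    # chance the detector fires with no photon at all
TRIALS = 100_000

clicks = 0           # herald detector fired
true_heralds = 0     # ... and a twin signal photon really exists

for _ in range(TRIALS):
    pair = random.random() < PAIR_PROB
    fired = (pair and random.random() < HERALD_EFF) or (random.random() < DARK_PROB)
    if fired:
        clicks += 1
        if pair:
            true_heralds += 1

print(f"P(signal photon present | herald click) = {true_heralds / clicks:.3f}")
```

With these toy numbers the conditional probability comes out around 0.97: a herald click is a strong, though not perfect, guarantee of the twin photon, with the shortfall coming entirely from the dark counts.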

Nuclear And Magnetic Resonance Mix

I have briefly explained magnetic resonance imaging before, so it is advisable to read that previous article first. One of the other common methods of biological scanning is nuclear imaging. Despite gamma rays' notorious reputation, radioisotopes that give off gamma rays are among the safest to ingest, as gamma rays' tendency to pass through even concrete means they will certainly pass through the body and allow an exterior detector to see where the rays are coming from. MRI allows for good detail and contrast between the different parts of the body, while γ-imaging allows targeted parts of the body to be observed by manipulating where the gamma source will end up.

By using electromagnetic radiation and a varying magnetic field, a radioactive tracer can be polarised and have its spin quantum number given specific values. This means that the gamma radiation given off can be detected and formed into an image not by a radiation camera but by a single radiation detector. The polarisation and spin effect is identical to what occurs in water molecules in MRI, and the similarities to nuclear imaging are clear. This new method combines the best parts of both of the originals and was able to detect 8.7 ng (8.7×10⁻¹² kg) of Xe-131m in a model glass cell. This means 4×10¹³ atoms had information taken from them; if this were standard MRI and water had to be used, it would have been in the region of 10²⁴ water molecules that needed to be imaged. This method opens up new possibilities when it comes to radioactive tracers and the precise imaging of the human body.
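As a quick sanity check on those figures, a few lines of Python with Avogadro's number and an approximate molar mass for Xe-131m recover the quoted mass from the atom count:

```python
# Back-of-the-envelope check of the Xe-131m figures above.
AVOGADRO = 6.022e23        # atoms per mole
MOLAR_MASS = 131.0         # g/mol, approximate for Xe-131m

atoms = 4e13
mass_g = atoms / AVOGADRO * MOLAR_MASS
print(f"{atoms:.0e} atoms of Xe-131m ~ {mass_g:.1e} g ({mass_g / 1e3:.1e} kg)")

# For comparison, the ~10^24 water molecules conventional MRI would need:
water_g = 1e24 / AVOGADRO * 18.0   # 18 g/mol for H2O
print(f"1e24 water molecules ~ {water_g:.0f} g")
```

The first line prints roughly 8.7×10⁻⁹ g (8.7×10⁻¹² kg), matching the detected mass above, while the water comparison comes out to about 30 g, some eleven orders of magnitude more material.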

Separate Phases In Superfluids

Superconductivity is something most people are willing to accept without question. Why shouldn't it be possible for electrons to move through a metal without encountering resistance? This is probably because electricity is quite an abstract thing: we can't see it, and most of the time its presence isn't noted. Something that makes people uncomfortable, however, is the concept of a superfluid. A superfluid is a liquid which experiences no loss of kinetic energy while flowing and has a viscosity of zero. Even air has a viscosity of 1.81×10⁻⁴ P (poise, named after Jean Poiseuille). If you give a vat of superfluid a single stir, it will continue to circle forever, like the never-diminishing current in superconductors.

[Figure: liquid helium II creeping up and over the walls of its container. Diagram by Julio Reis.]

The picture above shows another property of superfluids: they are more determined than their normal counterparts to become level, and so will creep over the walls of the central container until there is a uniform level throughout the system.

Helium is the most common element that gets turned into a superfluid. Helium-4 is used most often, as it forms a superfluid at a higher temperature and has the unique aspect that each individual helium-4 atom is a boson instead of a fermion. But research performed on helium-3 has shown that there are in fact two different states in the helium-3 superfluid, unsurprisingly labelled A and B. The B phase is generally the more stable of the two and is the superfluid on which most research has been done in the past. But when the temperature or pressure is raised to the point where superfluidity might be lost, the helium-3 undergoes a phase change, and it is the A phase that is more stable in these conditions.

Creating A Stable Optical Clock

Time is something that has always fascinated humanity. The ancient Egyptians invented the first sundial to measure out the day, and the ancient Greek philosophers pondered what time's real nature was. Even today it seems we are always inventing a new, slightly more accurate clock that lets us know our accuracy to another femtosecond or two. Although it seems frivolous, this research is actually very important. It is using clocks like these that we can begin to test Einstein's and Lorentz's theories about how the universe operates. Being able to tell such accurate time also has practical benefits in communications and astrophysics, where as time progresses any discrepancy in a clock's counting can be disastrous.

The new ideas being tested are optical lattice clocks. An optical lattice is not a physical, material thing as it might sound; it is actually the effect generated by the interference of lasers, which creates a neat periodic pattern. Atoms then get trapped at the lowest potentials of this pattern, which resembles a crystalline lattice in shape. A clock designed around an optical lattice would be extremely accurate but also portable, and could easily be used for high-quality research performed in space. The problem is that the lasers used in these devices are quite unstable, and so work has to be done to make sure they can be maintained for a sufficient length of time. Recently a paper has managed to demonstrate a new setup with reduced instability in the laser and an acceleration sensitivity of 3×10⁻¹⁰/g, which may sound unimpressive but is ten times better than the previous design. It is now believed that, since this clock has to be built in a vacuum, maintaining the pressure inside this chamber is the next big roadblock ahead.
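To get a feel for the lattice itself, here is a minimal numerical sketch of the standing-wave potential formed by two counter-propagating beams. The trap depth is arbitrary, and the 813 nm wavelength is just a typical "magic" wavelength used in strontium lattice clocks, not a value taken from the paper:

```python
import numpy as np

# Minimal sketch of a 1D optical lattice (illustrative numbers only).
# Two counter-propagating beams of wavelength lam interfere to give a
# standing-wave potential V(x) = V0 * sin^2(2*pi*x/lam); atoms collect
# at the minima, which repeat every half wavelength.
lam = 813e-9   # m; a typical "magic" wavelength for strontium clocks
V0 = 1.0       # trap depth in arbitrary units

x = np.linspace(0, 2 * lam, 1000)
V = V0 * np.sin(2 * np.pi * x / lam) ** 2

# Locate the interior potential minima numerically.
interior = (V[1:-1] < V[:-2]) & (V[1:-1] < V[2:])
sites = x[1:-1][interior]

print("site spacing:", np.diff(sites))   # ~ lam/2 between trapping sites
print("expected lam/2:", lam / 2)
```

The printed spacings come out at half the laser wavelength, which is why the trapped atoms end up arranged like a crystal even though there is no material crystal anywhere.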

Weekly Roundup 38

Nuclear accidents are pretty rare. There are seven levels on the scale of nuclear disasters, and only two events, Chernobyl and Fukushima, fall into the highest. Often this is because nuclear reactors are designed to remain sturdy no matter what happens: layer after layer of safety protocols and hazard checks are piled on to prevent even the smallest incident from occurring. When nuclear power was first being introduced it was believed that it would make fossil fuels obsolete and that electricity could be given away for free, although this has not happened, due in part to the extreme cost of building those safety precautions into the facilities. Occasionally, however, disasters can happen unrelated to nuclear power plants themselves. This one should be told as one of the horror stories of physics: the Goiânia accident. Indeed, it reads a lot like a horror story. Brazil, 1987: two men, eager to profiteer, break into an abandoned hospital and steal the inside of a machine they believe might have some scrap value. What they had stolen was the core of a radiotherapy unit, which they proceeded to wheel off, take home and dismantle. They continued despite beginning to vomit, and one obtained a burn on his hand in the same shape as the device's aperture. After freeing a capsule and breaking it open they found a blue glowing powder, which they sold on, and the story continues with various individuals not understanding that handling blue glowing material of unknown origin is not a good idea. Of course, hindsight is 20/20. It is possible that in the future people will look back on the things we did as foolish because of our ignorance. Would any of us recognise nuclear material if we saw it? Possibly not.

Either way, until tomorrow, goodnight.

Showing The Universe’s Steady Expansion

The best analogy for the expanding universe is a balloon being blown up, with an ant and multiple dots on the balloon's surface. The ant can be seen as a 2D being on a 2D universe expanding into a 3D space. The distance between any two spots grows at a faster rate the greater the distance between them. Both of these facts translate up to the real universe's expansion as well. For more complex details on the subject, reading Owais Najam's articles at The Brilliant Cosmos is recommended. The final key comparison to make is that, for the spots on the balloon, there is no centre. Each spot moves away from every other spot, and none can be considered a focused centre; this is not just due to relative motion but is an inherent fact about the model universe created. This fact, that wherever you go in the universe things generally stay the same, is an important concept for cosmological calculations.
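A few lines of Python make the balloon arithmetic concrete. In this toy model the dots keep fixed "comoving" coordinates while a scale factor stretches every distance; the dot positions and growth rate below are made-up illustrative numbers:

```python
import numpy as np

# Toy model of the balloon analogy: dots keep fixed "comoving"
# coordinates while a scale factor a(t) stretches all separations.
# Positions and the expansion rate are illustrative assumptions.
dots = np.array([0.0, 1.0, 3.0, 7.0])   # comoving positions of the dots

a, da_dt = 2.0, 0.1                     # scale factor and its growth rate

for i in range(1, len(dots)):
    distance = a * (dots[i] - dots[0])       # physical separation from dot 0
    speed = da_dt * (dots[i] - dots[0])      # rate that separation grows
    print(f"dot {i}: distance = {distance:.1f}, speed = {speed:.2f}, "
          f"speed/distance = {speed / distance:.3f}")
```

Every dot shows the same speed-to-distance ratio (the analogue of the Hubble constant, H = a′/a), so more distant dots recede faster, and repeating the calculation from any other dot gives exactly the same law: no dot is the centre.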

Recently, researchers from University College London and Imperial College London have managed to show that there is a 99.9992% chance that the universe is isotropic. They came to this conclusion by examining the cosmic microwave background data gathered by the Planck satellite to see whether, when the solutions of Einstein's field equations that assume an irregular, direction-dependent universe are applied, they match this data. This would also reveal if the universe was spinning about a point or had an even more irregular dynamic to it. Ultimately, this result is important as it reveals that mainstream cosmology is once again secure in its assumptions. If these kinds of tests are not carried out, we may end up wasting fifty years on an endeavour that turns out to be chasing a false goal.