Citation: Controlling cold molecules (2006, August 29) retrieved 18 August 2019 from https://phys.org/news/2006-08-cold-molecules.html

Krems and his student, T. Tscherbul, have written a theoretical paper explaining how cold molecules could be manipulated by an external electric field, in a letter titled “Controlling Electronic Spin Relaxation of Cold Molecules with Electric Fields.” It was published August 22nd in Physical Review Letters.

“Cold molecules,” Krems tells PhysOrg.com, “have lots of interesting applications.” Some of these applications include use in quantum computing and probing time-reversal symmetry in nature. When molecules are cooled to temperatures below 1 K, the experimental realization of these long-standing problems becomes more practicable. Krems wants to work with molecules cooled to around ½ K or less.

However interesting these applications may be, what Krems is really interested in is how cold molecules can be manipulated in chemical processes. Based on these theoretical results, he believes it should be possible to externally control cold molecules in a magnetic trap with electric fields. This means that collisions between molecules could be manipulated and greater control over molecular dynamics could be exerted, something that would allow chemists to learn more about chemical reaction mechanisms and test their theories of chemical reactions.

It is, however, difficult to thermally isolate molecules in a magnetic trap. “This sort of thing has been done with atoms,” says Krems, “but molecules present a different problem.” He explains that atoms are spherical, and that their magnetic spin does not re-orient after a collision. Molecules, though, are a different story. “The problem with molecules is that they are not spherical. Their orientation changes.
Applying an electric field may suppress spin re-orientation.” He pauses and then continues: “Being able to control molecular dynamics externally would be a great thing for chemistry.”

While the applications to chemistry are what excite Krems, he acknowledges that the new technique could also be helpful to physicists. Using external electric fields to control molecular collisions could help with measurements of the electric dipole moment of the electron in experiments designed to test whether time-reversal symmetry holds in nature. And, with quantum computing the hot topic of the day, the technique could be helpful in creating new ideas for quantum information processing. “Quantum computing with cold trapped molecules is popular right now. In the next six months I expect to see several new schemes.” Krems and Tscherbul’s work could help with that.

While Krems and Tscherbul’s work is theoretical right now, Krems is fairly certain that it can be experimentally confirmed in the near future. “I’ve been talking with quite a few people,” he says. “There are a lot of experiments going on right now. I hope that this paper will stimulate experimentalists to include strong electric fields for measurements in their experimental apparatuses.”

Krems thinks the theory could be confirmed in as little as half a year. “But,” he says, “you never know with these experiments. Surprises may be on the way. We are always waiting for surprises.” Krems’ hopes are certainly high. “This is a very new field and it is expanding rapidly. There is a future for cold molecules in chemistry — a very bright future.”

By Miranda Marquit, Copyright 2006 PhysOrg.com

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
“This is a chemist’s dream,” explains Roman Krems, a professor at the University of British Columbia in Vancouver, Canada. “We’ve been trying for 50 years to develop mechanisms to control molecular collisions externally.”
Citation: Using superconducting probes to get a picture of what it’s like inside CNTs (2009, November 20) retrieved 18 August 2019 from https://phys.org/news/2009-11-superconducting-probes-picture-cnts.html

Mason worked with Travis Dirks and Yung-fu Chen at the University of Illinois, as well as Norman Birge at Michigan State University, to develop a technique to map out changes in conductance through a carbon nanotube quantum dot. “We’re hoping to see what is happening in the interior, rather than what is influenced by the contacts,” Mason explains. “Then we can get at the fundamental electronics of quantum dots, which may be a key to future quantum technologies.” The results of the team’s work can be seen in Applied Physics Letters: “Superconducting tunneling spectroscopy of a carbon nanotube quantum dot.”

There are three elements to the technique, according to Mason. “First, there is a carbon nanotube quantum dot, which can act as a model ‘particle-in-a-box’ with quantized energy states. Next, we tunnel to the interior. The non-invasive probe allows us to study the bulk electronics, and also to separately test the effect of voltages across the length of the tube.” The third element is that the tunneling probe is a superconductor. “The superconductor enhances spectroscopic features. But it also shows how this technique is very flexible,” Mason says. “We can try different materials, multiple probes, or magnetic fields, for example.” Some of the spectroscopic features observed with the superconducting probe include signals from cotunneling and unusual scattering processes.

Mason points out that elements of this technique have been accomplished before. “However,” she continues, “I think that we are the first to put all the elements together to work as one system, by adding a third terminal and a superconducting probe.” Mason also points out that this set-up works with standard fabrication techniques.
“We used lithography, which is common in industry, and easily scalable.”

For now, most of the work is focused on fundamental properties of carbon nanotubes. “We are interested in seeing how these nanotube quantum dots work, and tracking what happens in them. We’ve already seen some unexpected features, such as an unusual energy exchange. Using our probe, it is possible to see these features, and explore them in greater depth.”

In the future, though, Mason sees the potential for technological applications. These types of quantum dots are being considered for quantum computers and even single-electron transistors. “There are a number of potential applications for this work, perhaps a decade or so down the road. And the first step is looking into the tube. We want to understand this system so that it might be used in future advanced technologies. Our superconducting tunnel probe will help us do just that.”

More information: Dirks, et al., “Superconducting tunneling spectroscopy of a carbon nanotube quantum dot,” Applied Physics Letters (2009). Available online: http://link.aip.org/link/?APPLAB/95/192103/1

Copyright 2009 PhysOrg.com. All rights reserved. This material may not be published, broadcast, rewritten or redistributed in whole or part without the express written permission of PhysOrg.com.

(PhysOrg.com) — “Carbon nanotubes are exciting for fundamental physics, and for potential technological applications,” Nadya Mason tells PhysOrg.com. “However, we are generally limited in the way that we can study them.
Many of these limitations have to do with controlling tunneling, or the way electrons move on and off the nanotube.” In order to overcome this limitation, Mason, a scientist at the University of Illinois at Urbana-Champaign, participated in an experiment using a superconducting tunnel probe on a carbon nanotube to observe spectroscopic features.
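The conductance mapping described here is, at its core, tunneling spectroscopy: one measures the current through the probe as a function of bias voltage and differentiates to obtain dI/dV, whose peaks mark the dot’s quantized energy states. The following minimal sketch shows that extraction step numerically; the I-V curve and the level position are toy data invented for illustration, not the team’s actual measurements or analysis code.

```python
import numpy as np

def differential_conductance(voltage, current):
    """Numerically differentiate a measured I-V curve to get dI/dV."""
    return np.gradient(current, voltage)

# Toy I-V data: a linear background plus a conductance step at a
# hypothetical dot level near V = 2 mV (all values are made up).
v = np.linspace(-5e-3, 5e-3, 501)                   # bias voltage (V)
level = 2e-3                                        # assumed dot level position
i = 1e-6 * v + 5e-9 * np.tanh((v - level) / 2e-4)   # current (A)

didv = differential_conductance(v, i)
v_peak = v[np.argmax(didv)]                         # bias where dI/dV peaks
print(f"dI/dV peaks near {v_peak * 1e3:.2f} mV")    # recovers the dot level
```

In a real experiment the peak positions (and, with a superconducting probe, the sharpened gap-edge features) are what reveal the dot’s spectrum; the numerical step above is the same regardless of the data source.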
Citation: Spray-on liquid glass is about to revolutionize almost everything (2010, February 2) retrieved 18 August 2019 from https://phys.org/news/2010-02-spray-on-liquid-glass-revolutionize.html

Image caption: The SiO2 coating on a filament of a microfibre. The fissure was induced in order to present an image that shows the characteristics of the coating.

More information: Nanopool: www.nanopool.eu/couk/index.htm

(PhysOrg.com) — Spray-on liquid glass is transparent, non-toxic, and can protect virtually any surface against almost any damage from hazards such as water, UV radiation, dirt, heat, and bacterial infections. The coating is also flexible and breathable, which makes it suitable for use on an enormous array of products.

The liquid glass spray (technically termed “SiO2 ultra-thin layering”) consists of almost pure silicon dioxide (silica, the normal compound in glass) extracted from quartz sand. Water or ethanol is added, depending on the type of surface to be coated. There are no additives, and the nano-scale glass coating bonds to the surface because of the quantum forces involved. According to the manufacturers, liquid glass has a long-lasting antibacterial effect because microbes landing on the surface cannot divide or replicate easily.

Liquid glass was invented in Turkey and the patent is held by Nanopool, a family-owned German company. Research on the product was carried out at the Saarbrücken Institute for New Materials.
Nanopool is already in negotiations in the UK with a number of companies and with the National Health Service, with a view to its widespread adoption.

The liquid glass spray produces a water-resistant coating only around 100 nanometers (15-30 molecules) thick. On this nanoscale the glass is highly flexible and breathable. The coating is environmentally harmless and non-toxic, and easy to clean using only water or a simple wipe with a damp cloth. It repels bacteria, water and dirt, and resists heat, UV light and even acids. Neil McClelland, UK project manager with Nanopool, said that soon almost every product you purchase will be coated with liquid glass.

Food processing companies in Germany have already carried out trials of the spray, and found that surfaces which usually had to be cleaned with strong bleach to keep them sterile needed only a hot-water rinse when coated with liquid glass. The levels of sterility were higher for the glass-coated surfaces, and the surfaces remained sterile for months.

Other organizations, such as a train company and a hotel chain in the UK, and a hamburger chain in Germany, are also testing liquid glass for a wide range of uses. A year-long trial of the spray in a Lancashire hospital produced “very promising” results for a range of applications including coatings for equipment, medical implants, catheters, sutures and bandages. The war graves association in the UK is investigating using the spray to treat stone monuments and gravestones, since trials have shown the coating protects against weathering and graffiti. Trials in Turkey are testing the product on monuments such as the Ataturk Mausoleum in Ankara.

The liquid glass coating is breathable, which means it can also be used on plants and seeds. Trials in vineyards have found that spraying vines increases their resistance to fungal diseases, while other tests have shown that sprayed seeds germinate and grow faster than untreated seeds, and that coated wood is not attacked by termites.
Other vineyard applications include coating corks with liquid glass to prevent “corking” and contamination of wine. The spray cannot be seen by the naked eye, which means it could also be used to treat clothing and other materials to make them stain-resistant. McClelland said you can “pour a bottle of wine over an expensive silk shirt and it will come right off.”

In the home, spray-on glass would eliminate the need for scrubbing and make most cleaning products obsolete. Since it is available in both water-based and alcohol-based solutions, it can be used on oven surfaces, bathroom tiles, sinks, and almost every other surface in the home, and one spray is said to last a year.

Liquid glass spray is perhaps the most important nanotechnology product to emerge to date. It will be available in DIY stores in Britain soon, with prices starting at around £5 ($8 US). Other outlets, such as many supermarkets, may be unwilling to stock the product because they make enormous profits from cleaning products that need to be replaced regularly, and liquid glass would make virtually all of them obsolete.

© 2010 PhysOrg.com
More information: lab.rekimoto.org/projects/possessedhand/

Citation: PossessedHand: Technology group develops device to move your fingers for you (2011, June 24) retrieved 18 August 2019 from https://phys.org/news/2011-06-possessedhand-technology-group-device-fingers.html

(PhysOrg.com) — In an interesting meshing of robotics and prosthetics development, Japanese researchers from Tokyo University, working in conjunction with Sony Corporation, have created an external forearm device capable of causing independent finger and wrist movement. Introduced on the Rekimoto Lab website, the PossessedHand, as it’s called, can be strapped to the wrist like a blood pressure cuff and fine-tuned to the individual wearing it.

The PossessedHand sends small doses of electricity to the muscles in the forearm that control movement, and can be “taught” to send preprogrammed signals that replicate normal wrist and finger movements, such as those used in plucking the strings of a musical instrument. Though the signals sent are too weak to actually cause string plucking, they are apparently strong enough to make the user understand which finger is supposed to be moved; thus, the device might be construed as more of a learning device than an actual guitar accessory.

Current devices that do roughly the same thing rely on electrodes inserted into the skin, or work via gloves worn over the hand, both rather kludgy and perhaps somewhat painful. This new approach, in contrast, is said to feel more like a gentle hand massage.
Though the original purpose of the PossessedHand seems to be as an aid to help people learn to play musical instruments (something that has inspired a bit of criticism from the musical community, since nothing is actually learned when using the device; the hand basically becomes an external part of the instrument while the brain remains passive), it seems clear the device could be used in multiple other ways. For example, it could be used by hearing people to assist in speaking with deaf sign-language users, to help people type who have never learned how, or, perhaps more importantly, to help paralyzed people or those suffering from a stroke.

In these instances it’s not always imperative that the user actually learn anything new, just that they are able to communicate when they want to. If the programming of the device could be made to work in real time in other ways, by the user, then its value would greatly increase. For example, if a person could speak out loud into a microphone and those words could then be captured, translated to sign language and transferred directly to their fingers, deaf people would instantly be able to communicate with anyone they meet who is willing to wear the cuff.

Image: Interfaculty Initiative in Information Studies, The University of Tokyo

© 2010 PhysOrg.com
Hopkins and his team have focused their research on chimpanzees, mainly, they say, because chimps are our closest living relatives and the only other species besides humans that regularly throw things with a clear target in mind. He and his team have been watching chimps in action for several years and comparing their actions with scans of their brains to see if there were any correlations between those chimps that threw a lot and those that didn’t, or whether their accuracy held any deeper meaning.

Surprisingly, they found that chimps that both threw more and were more likely to hit their targets showed heightened development in the motor cortex, and more connections between it and Broca’s area, which they say is an important part of speech in humans. The better chimp throwers, in other words, had more highly developed left brain hemispheres, which is also, non-coincidentally, where speech processing occurs in people. Such findings led the team to suggest that the ability to throw is, or was, a precursor to speech development in human beings.

After making their discovery regarding the parts of the brain that appear to be involved in better throwing in chimps, the team tested the chimps and found that those that could throw better also appeared to be better communicators within their group, giving credence to their idea that speech and throwing are related. Interestingly, they also found that the better-throwing chimps didn’t appear to possess any more physical prowess than other chimps, which the researchers suggest means that throwing didn’t develop as a means of hunting, but as a form of communication within groups; i.e., throwing stuff at someone else became a form of self-expression, as is clearly evident to anyone who has ever been targeted by a chimp locked up in a zoo.

Image: Common chimpanzee in the Leipzig Zoo. Image credit: Thomas Lersch, via Wikipedia.
(PhysOrg.com) — A lot of people who have gone to the zoo have become the targets of feces thrown by apes or monkeys, and have no doubt been left wondering about the so-called intellectual capacity of a beast that would resort to such foul play. Now, however, researchers studying such behavior have come to the conclusion that throwing feces, or any object really, is actually a sign of highly ordered behavior. Bill Hopkins of Emory University and his colleagues have been studying the whole process behind throwing and the impact it has on brain development, and have published their results in Philosophical Transactions of the Royal Society B.

Citation: Researchers find poop-throwing by chimps is a sign of intelligence (2011, November 30) retrieved 18 August 2019 from https://phys.org/news/2011-11-poop-throwing-chimps-intelligence.html

More information: The neural and cognitive correlates of aimed throwing in chimpanzees: a magnetic resonance image and behavioural study on a unique form of social tool use, Phil. Trans. R. Soc. B 12 January 2012 vol. 367 no. 1585 37-47, doi: 10.1098/rstb.2011.0195

Abstract: It has been hypothesized that neurological adaptations associated with evolutionary selection for throwing may have served as a precursor for the emergence of language and speech in early hominins. Although there are reports of individual differences in aimed throwing in wild and captive apes, to date there has not been a single study that has examined the potential neuroanatomical correlates of this very unique tool-use behaviour in non-human primates.
In this study, we examined whether differences in the ratio of white (WM) to grey matter (GM) were evident in the homologue to Broca’s area as well as the motor-hand area of the precentral gyrus (termed the KNOB) in chimpanzees that reliably throw compared with those that do not. We found that the proportion of WM in Broca’s homologue and the KNOB was significantly higher in subjects that reliably throw compared with those that do not. We further found that asymmetries in WM within both brain regions were larger in the hemisphere contralateral to the chimpanzee’s preferred throwing hand. We also found that chimpanzees that reliably throw show significantly better communication abilities than chimpanzees that do not. These results suggest that chimpanzees that have learned to throw have developed greater cortical connectivity between primary motor cortex and the Broca’s area homologue. It is suggested that during hominin evolution, after the split between the lines leading to chimpanzees and humans, there was intense selection on increased motor skills associated with throwing and that this potentially formed the foundation for left hemisphere specialization associated with language and speech found in modern humans.

© 2011 PhysOrg.com

Journal information: Philosophical Transactions of the Royal Society B
The researchers performed full-device simulations to investigate the solar cell’s potential efficiency. For every layer in the model, they considered numerous factors, such as material composition, lattice constant, thickness, dielectric constant, electron affinity, band gap, effective conduction and valence band densities, electron and hole mobilities, the doping concentration of shallow acceptors and donors, the thermal velocity of electrons and holes, the alloy density, Auger recombination for electrons and holes, direct band-to-band recombination, and how many photons of each wavelength are absorbed and reflected by each layer based on its dielectric properties.

Accounting for all these factors, the simulations showed that the 3-junction design could achieve an efficiency of 51.8% under 100-suns illumination, a great improvement over the current best 43.5% efficiency under 418-suns illumination. All three subcells in the new design had a maximum external quantum efficiency of 80% and absorbed light from a wide range of the spectrum.

"The multijunction solar cells are tested under different numbers of suns because they are often used in concentrator photovoltaic systems, which allow us to reduce the size or number of cells needed," Leite explained. "These strategies tolerate the use of more expensive semiconductor materials, which would otherwise be cost-prohibitive. The results can certainly be compared with each other, as long as the illumination sources are well calibrated."

The researchers also built a proof-of-principle solar cell with an equivalent design, which they fabricated on an indium phosphide (InP) substrate.
The solar cell was not optimized, so its efficiency was far from the theoretical prediction, but the results nevertheless demonstrated the capability of experimentally realizing the design. The scientists predict that, with further improvements, this equivalent 3-junction solar cell could have a practical efficiency of around 20% under 1-sun illumination.

"[The fabricated solar cell] presents a poor current match but demonstrates our capability of growing high-quality semiconductor compounds with an extremely low density of defects and stoichiometry very close to what is required for the optimized design," Leite said. "The bandgap-optimized design is formed by the same class of alloys, and has a great current match. So, upon optimization of anti-reflection coatings and other design parameters, the simulations indicate that one can reach greater than 50% under concentrated sunlight."

In addition to an optimized anti-reflection coating, some of the other improvements may involve adding window and back-surface layers to reduce loss, and thickening the lower two subcells to absorb long-wavelength light more completely.

"I am very excited about our initial results concerning a bandgap-optimized design," Leite said. "In the near future I plan to work on the integration of the optimized design into the single crystal template in order to fabricate a first monolithic (1.93 eV)InAlAs/(1.39 eV)InGaAsP/(0.94 eV)InGaAs solar cell. Simultaneously, we are looking into anti-reflection coating options for the InAlAs top subcell, which will require an oxygen-free material or the combination of an oxide and a sulfide as a protective layer."

More information: Marina S. Leite, et al. "Towards an optimized all lattice-matched InAlAs/InGaAsP/InGaAs multijunction solar cell with efficiency >50%." Applied Physics Letters 102, 033901 (2013)

(Phys.org) — Scientists have designed a new multijunction solar cell that, in simulations, can achieve an efficiency of 51.8%.
This high performance exceeds the current goal of 50% efficiency in multijunction solar cell research, as well as the current world record of 43.5% for a 3-junction solar cell.

Citation: Multijunction solar cell could exceed 50% efficiency goal (2013, February 20) retrieved 18 August 2019 from https://phys.org/news/2013-02-multijunction-solar-cell-efficiency-goal.html

Copyright 2013 Phys.org All rights reserved. This material may not be published, broadcast, rewritten or redistributed in whole or part without the express written permission of Phys.org.

Journal information: Applied Physics Letters

Image caption: The new multijunction solar cell design has three subcells that each have different band gaps to absorb different parts of the solar spectrum. The scientists focused on improving the current match and the lattice match among the subcells to achieve the highest simulated efficiency for this type of solar cell to date. Credit: Marina S. Leite, et al. ©2013 American Institute of Physics

The work was performed by a collaboration of researchers from the California Institute of Technology in Pasadena; the National Institute of Standards and Technology in Gaithersburg, Maryland; the University of Maryland in College Park; and Boeing-Spectrolab, Inc., in Sylmar, California. The team published a paper on their work in a recent issue of Applied Physics Letters.

As the researchers explain, multijunction solar cells are one of the most promising devices for efficiently converting sunlight into electricity. In multijunction solar cells, each junction or subcell absorbs and converts sunlight from a specific region of the spectrum. The subcells can be stacked on top of one another so that sunlight first strikes the highest-bandgap subcell, which is tuned to light with the shortest wavelengths, or highest energies.
The longer wavelengths pass through the first subcell and strike the lower-bandgap subcells.

This arrangement offers a significant advantage over single-junction solar cells, which have a maximum theoretical efficiency of only 34%. In theory, an "infinite-junction" solar cell has a maximum theoretical efficiency of almost 87%. But to approach this level, multijunction solar cells not only need multiple subcells, but also optimal semiconductor materials for the subcells, to provide a combination of band gaps that covers as much of the solar spectrum as possible.

To improve upon the current best multijunction solar cells, the researchers here focused on improving the current match between the different subcells, along with using a lattice-matched design. Both of these factors have previously limited multijunction solar cell efficiency.

"The lattice match corresponds to the matching between the crystal unit cells of the different subcells," lead author Marina Leite, an energy researcher at the National Institute of Standards and Technology, told Phys.org. "By using subcells that are lattice-matched, we can minimize dislocations and other crystal defects that can significantly affect the performance of the device. A current match is required for two-terminal tandem configurations because in this case a single current passes through all the subcells and the voltages are added; therefore, if one subcell has less photocurrent it will limit the current generated by the entire device. The current match is desired so that each individual subcell works at its own maximum power point of operation."
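Leite's point about current matching can be illustrated with a toy calculation: in a two-terminal, series-connected stack, the subcell with the smallest photocurrent caps the current of the whole device, while the subcell voltages add. The numbers below are invented purely for illustration and are not from the paper.

```python
# Toy illustration of current matching in a two-terminal multijunction
# cell: subcells are in series, so the stack's current is limited by the
# smallest subcell photocurrent, while voltages add. All numbers made up.

def tandem_power(subcells):
    """subcells: list of (photocurrent in mA/cm^2, voltage in V) per subcell."""
    current = min(i for i, _ in subcells)   # weakest subcell limits the current
    voltage = sum(v for _, v in subcells)   # voltages add in series
    return current * voltage                # output power density, mW/cm^2

matched    = [(14.0, 1.5), (14.0, 1.0), (14.0, 0.6)]   # equal photocurrents
mismatched = [(18.0, 1.5), (14.0, 1.0), (10.0, 0.6)]   # same total photocurrent

print(f"{tandem_power(matched):.1f}")     # 43.4
print(f"{tandem_power(mismatched):.1f}")  # 31.0: weakest subcell drags it down
```

Even though both stacks collect the same total photocurrent, the mismatched one wastes the excess generated in its stronger subcells, which is why the researchers tuned subcell thicknesses and band gaps so each subcell produces nearly the same current.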
Image caption: This is the first plot, showing the predicted bottom-quark forward-backward asymmetry (in percent) plotted against the energy (in GeV) of the bottom quark-antiquark pair produced in a proton-antiproton collision at the Tevatron. Orange is the prediction of the Standard Model. The other colors correspond to predictions from proposed extensions of the Standard Model that add a new particle called an "axigluon," which was proposed to explain the anomaly in the observed top-quark forward-backward asymmetry. The different colors correspond to different parameters assumed for the axigluon (the mass of the axigluon, MG, and the strength of its interaction with the top quark, g). The blue curve deviates strongly from the orange, so experiments should be able to tell the difference. The other colors do not deviate as much, so more work or a different method would be needed if the axigluon has parameters like those (but the blue line is for the parameters that best explain the top-quark case). Credit: Grinstein and Murphy. ©2013 American Physical Society

(Phys.org) — While scientists have become increasingly convinced that the Standard Model of particle physics is incomplete, it's still unclear exactly how the Standard Model needs to be extended. Experiments have shown that the Standard Model cannot explain certain top-quark observations, but a variety of extensions of the Standard Model have been proposed to explain them, and it's unclear which extension is correct. In a new paper published in Physical Review Letters, physicists Benjamín Grinstein and Christopher W. Murphy at the University of California, San Diego, have explained how upcoming data on the bottom quark can be used to distinguish between competing new-physics explanations of unexpected top-quark data.

Journal information: Physical Review Letters

© 2013 Phys.org. All rights reserved.
If new physics is involved, as the physicists expect, then the bottom-quark forward-backward asymmetry might be larger than predicted by the Standard Model, or the asymmetry may even be reversed.

"The Standard Model of electroweak and strong interactions predicts a very small bottom-quark forward-backward asymmetry at the Tevatron, of the order of a few percent," Grinstein said. "What we have shown in our work is that new physics can change this number dramatically. One of the interesting features we discovered is that when the energies of the b-quark and b-antiquark sum to the rest energy of the Z-boson (one of the particles responsible for weak interactions), the asymmetry is enhanced. We furthermore showed that, at this particular energy, the effects of new physics can be greatly amplified. For example, in one popular class of models the sign of the asymmetry is reversed, relative to that predicted by the Standard Model, in the energy region corresponding to the Z-boson's rest energy."

The plots in the physicists' paper tell a more detailed story of the possibilities for new physics. The two plots included here show the predicted bottom-quark forward-backward asymmetry (in percent) plotted against the energy (in GeV) of the bottom quark-antiquark pair produced in a proton-antiproton collision at the Tevatron. In both plots, the orange line represents the Standard Model prediction, while the other colors correspond to predictions from proposed extensions of the Standard Model. The plots are not continuous; instead, they are bar graphs in which quark pairs with energies in given "bins" are collected together. The black vertical bars indicate the sensitivity that the CDF experiment at Fermilab predicts, meaning the experimenters will be able to distinguish between colored lines that are separated by more than the size of the black bar.
“There has been much excitement the last couple of years, precipitated by reports from the two experimental collaborations working at the Tevatron (at Fermilab, outside Chicago) that a much larger-than-expected top-quark forward-backward asymmetry is seen,” Grinstein told Phys.org. “Several models have been proposed to explain this unexpected result. Our paper suggests a way to distinguish among the various models that have been proposed, since these models give very different bottom-quark forward-backward asymmetries. When a sufficiently precise measurement of the bottom-quark forward-backward asymmetry is performed, we will be able to narrow down significantly the new physics that the Tevatron experiments seem to have uncovered.

“But perhaps more importantly, observations of the bottom-quark forward-backward asymmetry in disagreement with expectations from the Standard Model, when put together with the top-quark forward-backward asymmetry, would demonstrate fairly conclusively that there is new physics, in the form of new particles and interactions not included in the Standard Model, and would point the way toward its direct experimental confirmation. So, as you can see, this would go to the heart of the question in particle physics.”

As the physicists explain, a quark’s forward-backward asymmetry refers to the likelihood that the quark is moving in the forward or backward direction after it is produced in a proton-antiproton collision.

“The Tevatron is a large circular particle accelerator in which protons and antiprotons travel in opposite directions,” Murphy said. “The direction of travel of a proton at the point it collides with an antiproton is called the ‘forward’ direction. Often a b-quark and an anti-b-quark are produced as a result of the proton-antiproton collision.
“There are several ways to define a ‘bottom-quark forward-backward asymmetry,’ but they are all a measure of how much more (or less) likely it is for the produced b-quark to be moving preferentially in the forward direction. For example, one may count the number of collisions with a forward-moving b-quark, subtract the number of collisions with a backward-moving b-quark, and divide this by the total number of collisions that produce b-quarks. It should be noted that the asymmetry is not just a single number, because it can be determined for various values of the energies of the produced b-quarks. So in fact the asymmetry is a function of the energy of the b-quarks.”

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

Citation: Quark asymmetries hint at physics beyond the Standard Model (2013, August 26) retrieved 18 August 2019 from https://phys.org/news/2013-08-quark-asymmetries-hint-physics-standard.html

More information: Benjamín Grinstein and Christopher W. Murphy. “Bottom-Quark Forward-Backward Asymmetry in the Standard Model and Beyond.” PRL 111, 062003 (2013). DOI: 10.1103/PhysRevLett.111.062003

This is the second plot showing the predicted bottom-quark forward-backward asymmetry (in percent) plotted against the energy (in GeV) of the bottom quark-antiquark pair produced in proton-antiproton collisions at the Tevatron. Again, orange is the prediction of the Standard Model. The other colors are for a model of “scalar weak doublets”; the particles in this model are also characterized by two parameters, a mass and an interaction strength. Interestingly, in the bin that includes the Z boson, the sign of the asymmetry is reversed for the scalar weak doublets model.
The effect is large enough that the experiment should easily distinguish the new physics, if present. Credit: Grinstein and Murphy. ©2013 American Physical Society
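The counting definition Murphy gives above can be sketched in a few lines. The toy example below uses entirely made-up events, bin edges, and a small invented forward preference (none of it is Tevatron data) to compute A_FB = (N_forward - N_backward) / (N_forward + N_backward) separately for each energy bin, mirroring the binned bar graphs in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_events = 10_000

# Toy b b-bar events: pair energy in GeV and the b-quark direction
# (+1 = forward, along the proton beam; -1 = backward). Illustrative
# numbers only, with a mild invented 52/48 forward preference.
energy = rng.uniform(40.0, 130.0, size=n_events)
direction = rng.choice([+1, -1], size=n_events, p=[0.52, 0.48])

def binned_afb(energy, direction, edges):
    """A_FB = (N_forward - N_backward) / (N_forward + N_backward),
    computed separately for each energy bin."""
    afb = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (energy >= lo) & (energy < hi)
        n_f = int(np.sum(direction[sel] == +1))
        n_b = int(np.sum(direction[sel] == -1))
        afb.append((n_f - n_b) / (n_f + n_b) if (n_f + n_b) else 0.0)
    return np.array(afb)

# Hypothetical bin edges in GeV; the 80-100 bin contains the Z-boson
# rest energy (~91 GeV), where the paper predicts the asymmetry is
# most sensitive to new physics.
edges = np.array([40.0, 60.0, 80.0, 100.0, 130.0])
print(binned_afb(energy, direction, edges))
```

Because the asymmetry is computed per bin, the same code naturally produces the energy-dependent function the physicists describe rather than a single number.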
For many years, Earth scientists and others have used Easter Island and its inhabitants, the Rapa Nui, as a lesson in what can happen when a parcel of land is overpopulated and thus overused: resources diminish and the people starve to death (or resort to cannibalism, as some have suggested). But now, the researchers with this new effort suggest that thinking may be wrong.

Scientists believe Polynesians first settled on Easter Island sometime around 1200 AD. Over the course of the next several hundred years, the settlers became the Rapa Nui, famous for the massive moai statues they erected. Over that time period, the people cut down most of the trees on the northern part of the island, along with much of the other vegetation. That led to the loss of nutrient-rich topsoil through erosion, and to the idea that the people began to starve to death.

To better understand what actually occurred both before and after Europeans arrived in the 1700s, the researchers used a technique known as obsidian hydration dating on artifacts found at various sites on the northern part of the island where the Rapa Nui lived. That allowed them to gain insight into how the land in that area had been used during different time periods, and from that they were able to construct a timeline showing where the people were living over the course of hundreds of years. And that, the researchers report, showed that rather than a population crash due to starvation, there were population shifts that reflected changing weather patterns. Some areas did see population losses before European contact, and some actually saw initial gains afterwards. The population did see a dramatic decline, of course, sometime thereafter, as the Rapa Nui were exposed to European diseases such as smallpox and syphilis and as many were taken and sold into slavery. This means, the team concludes, that there is little evidence of population collapse prior to European contact.
(Phys.org)—A team of researchers with members from the U.S., Chile and New Zealand has uncovered evidence that contradicts the conventional view of the demographic collapse of the Rapa Nui people living on Easter Island, both before and after European contact. In their paper published in Proceedings of the National Academy of Sciences, the team describes how they conducted obsidian hydration dating of artifacts from the island to trace the history of human activity in the area, and what they found in doing so.

More information: Variation in Rapa Nui (Easter Island) land use indicates production and population peaks prior to European contact, Christopher M. Stevenson, PNAS, DOI: 10.1073/pnas.1420712112

Abstract: Many researchers believe that prehistoric Rapa Nui society collapsed because of centuries of unchecked population growth within a fragile environment. Recently, the notion of societal collapse has been questioned, with the suggestion that extreme societal and demographic change occurred only after European contact in AD 1722. Establishing the veracity of demographic dynamics has been hindered by the lack of empirical evidence and the inability to establish a precise chronological framework. We use chronometric dates from hydrated obsidian artifacts recovered from habitation sites in regional study areas to evaluate regional land use within Rapa Nui. The analysis suggests region-specific dynamics, including precontact land-use decline in some near-coastal and upland areas, and postcontact increases and subsequent declines in other coastal locations. These temporal land-use patterns correlate with rainfall variation and soil quality, with poorer environmental locations declining earlier. This analysis confirms that the intensity of land use decreased substantially in some areas of the island before European contact.
A Rapa Nui rock garden, or agricultural field, with Poike volcano in the background. Credit: Christopher M. Stevenson

Journal information: Proceedings of the National Academy of Sciences

Citation: Study suggests history of Rapa Nui on Easter Island far more complex than thought (2015, January 6) retrieved 18 August 2019 from https://phys.org/news/2015-01-history-rapa-nui-easter-island.html

© 2015 Phys.org
Stellar flares are energetic and impulsive releases of large amounts of energy from a star. They occur when a shift in the star’s magnetic field accelerates electrons to speeds approaching that of light, resulting in eruptions that produce emission across the entire electromagnetic spectrum.

While flares from M stars provide some of the most dramatic stellar events, they are difficult to predict. Spotting such activity on this type of object requires long-duration measurements of many stars, which can be provided, for instance, by wide-field surveys for transiting exoplanets.

Recently, a team of astronomers led by James Jackman of the University of Warwick, U.K., analyzed observational data collected by NGTS between November 2015 and August 2016. NGTS is a ground-based transiting-exoplanet survey consisting of 12 telescopes, able to detect and resolve flares on both single and blended objects.

NGTS allowed Jackman’s team to detect a flare on NGTS J121939.5-355557 (NGTS J1219-3555 for short) on January 31, 2016. Located some 685 light years from Earth, NGTS J1219-3555 is a young (around 2.2-million-year-old) star of spectral type M3, about the size of our sun but slightly more than five times less massive. It has an effective temperature of 3,090 K.

“In this work, we have detected a high-energy stellar flare from the 2 Myr old pre-main sequence M star NGTS J121939.5-355557 with NGTS,” the researchers wrote in the paper.

According to the study, the flare had an energy of 3.2 undecillion (3.2×10^36) erg and a maximum amplitude of 7.2. The astronomers noted that this energy is greater than that of any M-dwarf flare observed with NASA’s Kepler space telescope, and is comparable to the energy emitted by the highest-energy G-star superflares. They added that the newly spotted flare is one of the largest-energy M-star flares ever observed.

Furthermore, in the flare peak the researchers found significant multi-mode quasi-periodic pulsations (QPPs).
The team underlined the importance of this finding: although such pulsations are commonly observed in solar flares, they remain relatively rare in stellar flare observations. The QPPs in the flare of NGTS J1219-3555 comprise two statistically significant periods of approximately 320 and 660 seconds, with an oscillation amplitude of 0.1. The astronomers added that these values make the flare described in the paper one of the largest-amplitude events to exhibit such pulsations.

In concluding remarks, the researchers underlined the importance of wide-field, long-timescale surveys such as NGTS in the search for high-energy events like the flare on NGTS J1219-3555. Finding and studying these flares could be important for improving our understanding of the formation and habitability of Earth-like alien worlds around M-type stars.

Zoom-in of the flare peak, in which oscillations are clearly seen. A flux spike, lasting only about 20-30 seconds, appears at the beginning of the oscillations, approximately 8 minutes after the start of the night. A green interpolating line is shown to aid the eye. Credit: Jackman et al., 2018.

Using the Next Generation Transit Survey (NGTS), astronomers have identified an energetic flare displaying quasi-periodic pulsations on the pre-main sequence M star NGTS J121939.5-355557. The newly detected flare is one of the most energetic flares seen on an M-type star to date. The finding is reported in a paper published November 5 on arXiv.org.

More information: James A.G. Jackman et al. Detection of a giant flare displaying quasi-periodic pulsations from a pre-main sequence M star with NGTS. arXiv:1811.02008 [astro-ph.SR]. arxiv.org/abs/1811.02008

Citation: Giant flare detected on a pre-main sequence M star (2018, November 13) retrieved 18 August 2019 from https://phys.org/news/2018-11-giant-flare-pre-main-sequence-star.html

© 2018 Science X Network
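As a rough illustration of how periods like the reported ~320 s and ~660 s can be pulled out of a flare light curve, the sketch below builds a synthetic, already-detrended flux series with those two periods baked in and recovers them with a simple Fourier periodogram. All numbers (cadence, duration, amplitudes, noise level) are invented for illustration; this is not the authors' analysis pipeline.

```python
import numpy as np

# Toy detrended light curve carrying two oscillation periods, loosely
# inspired by the ~320 s and ~660 s QPPs reported for NGTS J1219-3555.
rng = np.random.default_rng(1)
t = np.arange(0.0, 10560.0, 10.0)               # ~3 hours, 10 s cadence
flux = (0.10 * np.sin(2 * np.pi * t / 320.0)    # shorter-period pulsation
        + 0.06 * np.sin(2 * np.pi * t / 660.0)  # longer-period pulsation
        + 0.01 * rng.normal(size=t.size))       # white noise

# Discrete Fourier periodogram of the (already detrended) residual flux.
freqs = np.fft.rfftfreq(t.size, d=10.0)
power = np.abs(np.fft.rfft(flux)) ** 2

# Pick the two strongest peaks, ignoring the zero-frequency (mean) bin.
top = np.argsort(power[1:])[-2:] + 1
periods = np.sort(1.0 / freqs[top])
print(periods)  # recovers periods near 320 and 660 seconds
```

The time span (10,560 s) was chosen so both periods fall exactly on Fourier bins; a real light curve would additionally need detrending of the flare decay envelope and a significance test against the noise background.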
by NPR News, Howard Berkes and Robert Benincasa, 8.22.19 11:09am

An audit of the federal system that fines mining companies for unsafe conditions found no evidence that more than $1 billion in mine safety penalties over 18 years deterred unsafe mining practices. The four-year audit from the Office of Inspector General of the Labor Department says its analysis of Mine Safety and Health Administration accident and violations data “showed no correlation between penalties paid and the safety of mine operations.”

The audit was prompted by a 2014 NPR investigation of thousands of mines and mining companies that did find such a connection. Specifically, NPR found that mines that persistently ignored their penalties had injury rates 50 percent higher than mines that paid their fines. In total, NPR examined the safety records of mines that had failed to pay nearly $70 million in penalties, some with delinquent fines that were decades old.

The auditors instead reported data for mining companies, not individual mines, and in a statement to NPR acknowledged taking a different approach with their review. “We focused on mine operators, as they are responsible for the safety of miners and are assigned the financial responsibility for penalties,” the OIG’s statement explained. The office did not respond to specific questions NPR submitted and declined a request for an interview.

The audit doesn’t say whether its measures of safety and violations applied only after mines or mining companies failed to pay safety fines, or while they continued to be delinquent. NPR’s analysis of delinquent fines and safety applied only to mines while they were delinquent. The auditors also measured safety by looking at raw numbers and averages of “serious accidents” and “serious violations,” which is not the primary safety metric used by MSHA.
The agency instead typically uses an “incident,” or injury, rate, which takes into account the number of injuries that occur in a mine relative to the hours worked.

This shows that “penalties just aren’t high enough to deter bad behavior,” says Wes Addington, the executive director of the Appalachian Citizens Law Center in Whitesburg, Kentucky, who once conducted his own analysis of delinquent mine safety penalties and their impacts on safety. The audit shows that “violations are just a cost of doing business,” Addington adds. He also calls the audit “superficial” and “poorly designed” because it mixes coal mines with what are called metal and nonmetal mines. “Coal mining is one of the most dangerous occupations in the United States,” Addington says. Including other mines in the analysis “skews the data. They don’t have the injuries and violations coal mines have.”

The auditors did recommend that MSHA not permit mining companies to operate new mines if they have outstanding penalties at existing operations. “Without holding mine operators accountable for their safety record or delinquency status prior to commencing operations at a new mine, mine operators have less of an incentive to prevent future safety hazards,” the audit says. MSHA says it does not have the legal authority to deny mine operators the ability to open new mines due to unpaid fines.

The NPR investigation found that mining companies continued to operate multiple mines, and continued to put thousands of miners at risk, even after persistent failure to pay mine safety fines. In one case, the operators of the Kentucky Darby mine amassed nearly $3 million in safety fines after a deadly accident that killed five miners.
That includes $500,000 in fines the owners of the mine failed to pay for safety failures that contributed to the tragedy. In that and many other cases, NPR found that MSHA failed to use some of its authority to force companies to pay and to seek closure of mines that were persistently delinquent. The auditors found that MSHA has used that authority more often since the NPR investigation in 2014, and noted a more aggressive focus on delinquent and unsafe mines in both the Trump and Obama administrations.

The audit said MSHA deserves credit for collecting the vast majority of mine safety fines, roughly 90 percent in the 18 years reviewed. But the auditors also recommended that MSHA develop ways to measure the effectiveness of penalties. MSHA says that is difficult because fines are “one of many variables” used to make mines safe.

MSHA has recently resisted public disclosure of mining companies that fail to pay penalties. NPR sued MSHA in federal court last year under the Freedom of Information Act after the agency failed to provide its current delinquent-mines data. A settlement resulted in the release of that data, which showed that the nation’s top delinquent mine owners are companies owned by West Virginia Governor Jim Justice and his family. In a rare move in May, MSHA and the Justice Department sued the coal companies owned by the Justice family for failure to pay $4.7 million in unpaid mine safety fines.

Copyright 2019 NPR. To see more, visit NPR.
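The injury-rate metric mentioned above (injuries normalized by hours worked) can be sketched in a few lines. The 200,000-hour base below is the standard US occupational-safety convention (roughly 100 full-time workers for a year); whether MSHA's rate uses exactly this constant is an assumption here, not something stated in the article.

```python
# Standard normalization base used in US occupational-safety reporting:
# 200,000 employee-hours, about 100 full-time workers for one year.
# Assumed here for illustration; MSHA's exact formula may differ.
HOURS_BASE = 200_000

def injury_rate(injuries, hours_worked):
    """Injuries per 200,000 hours worked, so mines of different
    sizes can be compared on a common exposure base."""
    return injuries * HOURS_BASE / hours_worked

# A hypothetical mine reporting 12 injuries over 480,000 employee-hours:
print(injury_rate(12, 480_000))  # 5.0
```

Normalizing by exposure is what distinguishes this metric from the raw accident counts the auditors used: a large mine with many injuries can still have a lower rate than a small mine with few.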