NEWS
Post by nodstar on Apr 30, 2009 3:31:05 GMT 4
Sally Anne ..
A PM awaits you ;D
lotsa love
Nod
NEWS
Post by ownurfreewill on Apr 30, 2009 4:28:39 GMT 4
NEWS
Post by ninathedog on Apr 30, 2009 5:28:18 GMT 4
thank you Nod*! These bits of advice make the problem much more manageable, both mentally and emotionally! We control the outcome in our lives; those goobers can't hold us down!

I have to say that the Dr. Horowitz video had a lot of information, but he spoke so quickly and profusely that it was a little scary in tone, so it was hard for me to really absorb what he was saying.

Mexican Flu Outbreak 2009: SPECIAL REPORT by Dr Leonard Horowitz
(Dr. Horowitz also wrote the Foreword for "EMANATION OF THE SOLFEGGIO" by Drs. McDowell & Burisch)

Congressman (and M.D.) Dr. Ron Paul on the Recent Swine Flu Scare
Thank you, Own~Our~Free~Will!

On another note, I'm pretty sure that the very sorrow-filled photo that Dr. Horowitz used to illustrate the concept of "genocide" was one that my Gazan friend took when a mother, toddler and infant were wrapped for their burial. Their home had been shelled by the Israeli military, or maybe it was a bomb. I thought it was from late 2006, because certain photos do stick in my mind and this was certainly one of them, but I can't find it. His website is here, rafahtoday.org, if anyone wants to look for the photo (but please use discretion, as the reports and photos are often horrific).

Anyway, back to this pesky virus... thank you again, Nod*
NEWS
Post by satchmo on Apr 30, 2009 9:29:24 GMT 4
Italy Seizes Millions in Assets From Four Banks

With municipal bond investigations spreading to Europe from the United States, Italian authorities have seized about $300 million in assets of four global banks — JPMorgan Chase, Deutsche Bank, UBS and Depfa — whose officials have been accused of fraud. The Guardia di Finanza in Milan, the financial police of Italy, took over real estate properties, bank accounts and stock holdings on Monday to assure it could collect from the banks if their officials were found guilty and the banks were held responsible.

The seizures stem from the banks' handling of a $2.2 billion municipal bond issue and related financial contracts known as swaps that Milan undertook to retire other debt in June 2005. The lead prosecutor accused the bankers of misleading the city and falsely claiming that the deal would generate savings. If all the costs had been properly included, the prosecutor said, the entire deal would have been illegal under a national law that allows restructuring of debt only if it produces savings.

Alfredo Robledo, the prosecutor in Milan, suspects the banks made $130 million in illicit profits, according to information obtained in a joint investigation by the Italian business newspaper Il Sole 24 Ore and The International Herald Tribune. He is also investigating transactions by the banks with other local Italian governments and the possibility that public officials received kickbacks. About 35 billion euros ($46 billion) in bonds were issued by local Italian governments over the last decade, mostly by the London units of large banks based in the United States and Europe. A former executive from one of the banks being investigated in Milan said that all of these could be subject to challenge. Representatives of each of the four banks declined to comment. JPMorgan is based in New York, Deutsche Bank in Frankfurt, UBS in Zurich and Depfa is a unit of Hypo Real Estate in Munich.
Three of the banks are also being investigated over their municipal bond practices in the United States. Officials or former officials of JPMorgan Chase, Deutsche Bank and UBS, along with the institutions themselves, are the subjects of investigations, company filings and documents filed in civil cases show. In its annual report released last month, JPMorgan Chase acknowledged parallel investigations in the United States by the Justice Department and the Securities and Exchange Commission into possible antitrust and securities violations involving derivatives sold to local governments. JPMorgan said it was cooperating with the investigations and had provided documents.

On both sides of the Atlantic, the banks and their executives have been accused of misleading local governments and selling officials exotic financial products known as derivatives that they did not fully understand. These derivatives, when combined with bond offerings, were presented as ways to raise cash and reduce the long-term cost of debt, but officials now claim that many of the contracts, in the form of swaps, were packed with millions of dollars in fees that were not disclosed.

In his filings to a judge in Italy seeking the asset seizure, Mr. Robledo asserted that the bankers falsely claimed that the deal would save 57.3 million euros. While charging only nominal fees to show the refinancing would be beneficial, he said, the banks then hid their profits in the spread between what the city paid to the banks and what the banks gave in return on swaps contracts that accompanied the bond issue — a difference of 52.7 million euros. The original deal was rescheduled five times until October 2007 and produced an additional 48 million euros of profit for the banks. In total, the banks earned about 101 million euros in such payments over a two-year period. The largest share went to JPMorgan Chase, which the prosecutor said took in almost 45 million euros.
It is too soon to tell whether the long-term cost of the deal and the swaps contracts, which carry more risk than a plain-vanilla bond offering, will be even higher.

The jurisdiction of this case is in dispute. Last January, days after Milan announced that it was suing the banks in civil court, JPMorgan filed a countersuit in the High Court in London to have the claim heard there instead. In their presentations, the banks noted that they were regulated by the Financial Services Authority in Britain, an agency similar to the S.E.C., and that their contracts were subject to British laws. But the Italian investigators argue that they have authority to investigate any fraud. British law "would be applicable to civil proceedings, not a criminal one, such as this," said an investigator involved in the case who said he was not authorized to speak about a continuing investigation.

According to the Italian magistrate, British rules may have been violated as well. Citing the opinion of David Dobell, a former British financial regulator who is now a partner in CCL, a compliance consultancy, the prosecutor claimed that the banks breached their fiduciary duties as defined by the F.S.A. when dealing with a nonprofessional customer like Milan. These and other similar transactions could be invalidated if the banks breached those duties, requiring the banks to disgorge their profits from the deals and pay damages.

Besides the 10 bankers, Giorgio Porta, Milan's general manager at the time of the deal, and Mario Mauri, then a financial adviser to the mayor, are also under investigation. The Italian prosecutor could soon request help from the British regulator in determining whether intermediary or consultant fees were paid by the banks and to whom.

www.theledger.com/article/20090428/ZNYT01/904283014?Title=Italy-Seizes-Millions-in-Assets-From-Four-Banks

satchmo
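A quick sanity check of the article's figures (all amounts as reported by the prosecutor; this is just arithmetic on the quoted numbers, not a claim about the actual contracts):

```python
# Back-of-the-envelope check of the profit figures quoted in the article
# (all amounts in millions of euros, as reported).

claimed_savings = 57.3      # savings the banks claimed the refinancing would produce
spread_profit = 52.7        # profit hidden in the swap spread on the original deal
rescheduling_profit = 48.0  # additional profit from five reschedulings to Oct 2007

total_profit = spread_profit + rescheduling_profit
print(f"Total bank profit: ~{total_profit:.1f}M EUR")  # close to the ~101M the article cites

# The hidden spread alone nearly wipes out the claimed savings:
net_benefit_to_city = claimed_savings - spread_profit
print(f"Net benefit to Milan after hidden spread: {net_benefit_to_city:.1f}M EUR")
```

The prosecutor's point is visible in the numbers: the spread hidden in the swaps alone nearly cancels the savings the banks claimed the city would receive.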
NEWS
Post by kiek on May 1, 2009 0:30:51 GMT 4
NEWS
Post by galaxygirl on May 1, 2009 0:41:35 GMT 4
Hey Golden Threaders . . . surely do appreciate the broad spectrum of info about the hybrid flu that's been posted here and the level-headed way that it's being discussed. Nodstar* . . . yer background as a healthcare professional is a great asset right now. Keep bringing it on! I have been receiving soooo many emails from different sources . . . even other countries . . . saying how the flu situation is contrived and being hyped into a frenzy by the US media. One email in particular (from Geneva, Switzerland) gave Len Horowitz a big thumbs-down for his YouTube "infomercial" on the flu. Seems like this particular issue is serving as another avenue for exposing more of the Shadow as we're growing Lighter and more Transparent. No place for anything to hide these days!! ;D GG
NEWS
Post by malynda on May 1, 2009 4:03:01 GMT 4
I am really feeling positive right now. It's great to see so many people from all walks of life beginning to "wake up". Pretty much everyone I have spoken to from all over the world, like galaxygirl, thinks the flu is nothing to worry about. They see how they are trying to scare us, especially with the NYC flyover. They seem to be going too big in their game, and the little curtain shrouding the Great Oz seems to be slipping. It's a beautiful thing! =D
Thanks to all who bring us the news here every day, especially God, you know who you are. I've been seeing an interesting pattern emerge and have been trying to get it all down in words to post here. Hopefully I can have it done by the weekend and will be brave enough to post it.
Hope all are well!
NEWS
Post by towhom on May 1, 2009 4:27:40 GMT 4
Comparative Component Analysis of Exons with Different Splicing Frequencies
PLoS ONE
Received: November 30, 2008; Accepted: March 31, 2009; Published: April 30, 2009
www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0005387

Abstract
Transcriptional isoforms are not just random combinations of exons. What has caused exons to be differentially spliced, and are exons with different splicing frequencies subjected to divergent regulation by potential elements or splicing signals? Beyond the conventional classification for alternatively spliced exons (ASEs) and constitutively spliced exons (CSEs), we have classified exons from alternatively spliced human genes and their mouse orthologs (12,314 and 5,464, respectively) into four types based on their splicing frequencies. Analysis has indicated that different groups of exons present divergent compositional and regulatory properties. Interestingly, with the decrease of splicing frequency, exons tend to have greater lengths, higher GC content, and contain more splicing elements and repetitive elements, which seems to imply that the splicing frequency is influenced by such factors. Comparison of non-alternatively spliced (NAS) mouse genes with alternatively spliced human orthologs also suggested that exons with lower splicing frequencies may be newly evolved ones which gained functions as their splicing frequencies altered through evolution. Our findings reveal for the first time that certain factors may have critical influence on the splicing frequency, suggesting that exons with lower splicing frequencies may originate from old repetitive sequences, with splicing sites altered by mutation, gaining novel functions and becoming more frequently spliced.

Complete article available for download in multiple formats at the link displayed above.
NEWS
Post by towhom on May 1, 2009 4:41:53 GMT 4
Monomeric Bistability and the Role of Autoloops in Gene Regulation
PLoS ONE
Received: January 22, 2009; Accepted: March 23, 2009; Published: April 30, 2009
www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0005399

Abstract
Genetic toggle switches are widespread in gene regulatory networks (GRN). Bistability, namely the ability to choose among two different stable states, is an essential feature of switching and memory devices. Cells have many regulatory circuits able to provide bistability that endow a cell with efficient and reliable switching between different physiological modes of operation. It is often assumed that negative feedbacks with cooperative binding (i.e. the formation of dimers or multimers) are a prerequisite for bistability. Here we analyze the relation between bistability in GRN under monomeric regulation and the role of autoloops under a deterministic setting. Using a simple geometric argument, we show analytically that bistability can also emerge without multimeric regulation, provided that at least one regulatory autoloop is present.

Complete article available for download in multiple formats at the link displayed above.
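To make the abstract's point concrete, here is a toy deterministic model (all parameters invented, not taken from the paper) in which no dimers or multimers form: production combines a basal rate with a positive autoloop built from two sequential monomeric binding events, each a Hill-coefficient-1 term, against linear degradation. The paper's geometric argument is more general; this sketch only illustrates that two stable steady states can arise without multimeric binding as such:

```python
# Toy 1-D gene-expression model: dy/dt = b + a * h(y)^2 - delta * y,
# where h(y) = y/(K+y) is a monomeric (Hill-1) binding term and the
# autoloop multiplies two such independent monomeric binding steps.
# Parameters are illustrative only.

def dydt(y, a=6.0, b=0.05, K=1.0, delta=1.0):
    autoloop = (y / (K + y)) * (y / (K + y))  # two monomeric binding events
    return b + a * autoloop - delta * y

def steady_state(y0, dt=0.01, steps=20000):
    """Forward-Euler integration until (approximate) steady state."""
    y = y0
    for _ in range(steps):
        y += dt * dydt(y)
    return y

low = steady_state(0.0)   # small initial condition settles at the low state
high = steady_state(1.0)  # larger initial condition settles at the high state
print(f"low state  ~ {low:.3f}")
print(f"high state ~ {high:.3f}")
```

With these parameters the system has two stable fixed points (roughly 0.10 and 3.8) separated by an unstable one, so the initial condition decides which "memory" state the circuit latches into.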
NEWS
Post by towhom on May 1, 2009 5:10:27 GMT 4
A Differential Wiring Analysis of Expression Data Correctly Identifies the Gene Containing the Causal Mutation
PLoS Computational Biology
Received: October 7, 2008; Accepted: April 1, 2009; Published: May 1, 2009
www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1000382

Abstract
Transcription factor (TF) regulation is often post-translational. TF modifications such as reversible phosphorylation and missense mutations, which can act independently of TF expression level, are overlooked by differential expression analysis. Using bovine Piedmontese myostatin mutants as proof-of-concept, we propose a new algorithm that correctly identifies the gene containing the causal mutation from microarray data alone. The myostatin mutation releases the brakes on Piedmontese muscle growth by translating a dysfunctional protein. Compared to a less muscular non-mutant breed we find that myostatin is not differentially expressed at any of ten developmental time points. Despite this challenge, the algorithm identifies the myostatin 'smoking gun' through a coordinated, simultaneous, weighted integration of three sources of microarray information: transcript abundance, differential expression, and differential wiring. By asking the novel question "which regulator is cumulatively most differentially wired to the abundant, most differentially expressed genes?" it yields the correct answer, "myostatin". Our new approach identifies causal regulatory changes by globally contrasting co-expression network dynamics. The entirely data-driven 'weighting' procedure emphasises regulatory movement relative to the phenotypically relevant part of the network. In contrast to other published methods that compare co-expression networks, significance testing is not used to eliminate connections.

Author Summary
Evolution, development, and cancer are governed by regulatory circuits where the central nodes are transcription factors. Consequently, there is great interest in methods that can identify the causal mutation/perturbation responsible for any circuit rewiring. The most widely available high-throughput technology, the microarray, assays the transcriptome. However, many regulatory perturbations are post-transcriptional. This means that they are overlooked by traditional differential gene expression analysis. We hypothesised that by viewing biological systems as networks one could identify causal mutations and perturbations by examining those regulators whose position in the network changes the most. Using muscular myostatin mutant cattle as a proof-of-concept, we propose an analysis that succeeds based solely on microarray expression data from just 27 animals. Our analysis differs from competing network approaches in that we do not use significance testing to eliminate connections. All connections are contrasted, no matter how weak. Further, the identity of target genes is maintained throughout the analysis. Finally, the analysis is 'weighted' such that movement relative to the phenotypically most relevant part of the network is emphasised. By identifying the question to which myostatin is the answer, we present a comparison of network connectivity that is potentially generalisable.

Complete article available for download at the link displayed above.
This is an interesting concept.
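The scoring idea in the abstract ("which regulator is cumulatively most differentially wired to the abundant, most differentially expressed genes?") can be sketched on synthetic data. Everything below is invented for illustration — the gene layout, the use of plain correlation as the wiring measure, and the simple weighting; the paper's actual procedure is more elaborate:

```python
# Sketch of weighted differential wiring: score each gene by how much its
# co-expression with the differentially expressed genes changes between
# two conditions. Synthetic data; gene 0 is the hidden "regulator".
import numpy as np

rng = np.random.default_rng(0)
n = 100                  # samples per condition
t = rng.normal(size=n)   # hidden signal driving the network in condition A

# Rows are genes: gene 0 = regulator, genes 1-3 = its targets, genes 4-7 = bystanders.
cond_a = np.vstack(
    [t + 0.1 * rng.normal(size=n)]
    + [t + 0.3 * rng.normal(size=n) for _ in range(3)]
    + [rng.normal(size=n) for _ in range(4)])
# In condition B the regulator is decoupled (a post-transcriptional change,
# so its own expression level barely moves) while its targets shift in mean.
cond_b = np.vstack(
    [rng.normal(size=n)]
    + [2.0 + rng.normal(size=n) for _ in range(3)]
    + [rng.normal(size=n) for _ in range(4)])

delta_wiring = np.abs(np.corrcoef(cond_a) - np.corrcoef(cond_b))  # change in co-expression
de_weight = np.abs(cond_b.mean(axis=1) - cond_a.mean(axis=1))     # differential expression

scores = delta_wiring @ de_weight   # cumulative weighted differential wiring per gene
print("top-scoring gene:", int(np.argmax(scores)))
```

As in the myostatin example, gene 0 is not itself differentially expressed, yet it tops the score because its wiring to the differentially expressed part of the network changes the most.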
NEWS
Post by towhom on May 1, 2009 5:23:31 GMT 4
Representation of Time-Varying Stimuli by a Network Exhibiting Oscillations on a Faster Time Scale
PLoS Computational Biology
Received: August 27, 2008; Accepted: March 20, 2009; Published: May 1, 2009
www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1000370

Abstract
Sensory processing is associated with gamma frequency oscillations (30–80 Hz) in sensory cortices. This raises the question whether gamma oscillations can be directly involved in the representation of time-varying stimuli, including stimuli whose time scale is longer than a gamma cycle. We are interested in the ability of the system to reliably distinguish different stimuli while being robust to stimulus variations such as uniform time-warp. We address this issue with a dynamical model of spiking neurons and study the response to an asymmetric sawtooth input current over a range of shape parameters. These parameters describe how fast the input current rises and falls in time. Our network consists of inhibitory and excitatory populations that are sufficient for generating oscillations in the gamma range. The oscillation period is about one-third of the stimulus duration. Embedded in this network is a subpopulation of excitatory cells that respond to the sawtooth stimulus and a subpopulation of cells that respond to an onset cue. The intrinsic gamma oscillations generate a temporally sparse code for the external stimuli. In this code, an excitatory cell may fire a single spike during a gamma cycle, depending on its tuning properties and on the temporal structure of the specific input; the identity of the stimulus is coded by the list of excitatory cells that fire during each cycle. We quantify the properties of this representation in a series of simulations and show that the sparseness of the code makes it robust to uniform warping of the time scale. We find that resetting of the oscillation phase at stimulus onset is important for a reliable representation of the stimulus and that there is a tradeoff between the resolution of the neural representation of the stimulus and robustness to time-warp.

Author Summary
Sensory processing of time-varying stimuli, such as speech, is associated with high-frequency oscillatory cortical activity, the functional significance of which is still unknown. One possibility is that the oscillations are part of a stimulus-encoding mechanism. Here, we investigate a computational model of such a mechanism, a spiking neuronal network whose intrinsic oscillations interact with external input (waveforms simulating short speech segments in a single acoustic frequency band) to encode stimuli that extend over a time interval longer than the oscillation's period. The network implements a temporally sparse encoding, whose robustness to time warping and neuronal noise we quantify. To our knowledge, this study is the first to demonstrate that a biophysically plausible model of oscillations occurring in the processing of auditory input may generate a representation of signals that span multiple oscillation cycles.

This article has a broader spectrum of relevance than just "sensory processing".
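The coding idea in the abstract — stimulus identity carried by which cells fire in each gamma cycle — can be illustrated with a deliberately simple toy. This is not the paper's spiking network: a rising ramp stands in for the sawtooth, and at most one amplitude-tuned "cell" fires per gamma cycle; consecutive repeats are collapsed so the readout is the sequence of cell identities:

```python
def ramp(t, duration):
    """Stimulus amplitude in [0, 1): a rising ramp (stand-in for the sawtooth)."""
    return t / duration

def sparse_code(duration, gamma_period=1.0, n_cells=8):
    """Temporally sparse code: each gamma cycle, the cell tuned to the
    current amplitude fires once; the code is the sequence of firing-cell
    identities with consecutive repeats collapsed."""
    code = []
    t = gamma_period / 2.0          # sample each cycle at mid-cycle
    while t < duration:
        cell = min(int(ramp(t, duration) * n_cells), n_cells - 1)
        if not code or code[-1] != cell:
            code.append(cell)
        t += gamma_period
    return code

print(sparse_code(10.0))   # stimulus spanning 10 gamma cycles
print(sparse_code(20.0))   # same stimulus uniformly time-warped 2x: same code
```

Stretching the stimulus two-fold changes how many gamma cycles each cell fires in, but not the sequence of cell identities, which is the code — a cartoon of the time-warp robustness (and of the resolution cost) that the paper quantifies properly.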
NEWS
Post by towhom on May 1, 2009 5:38:21 GMT 4
Atomic physics study sets new limits on hypothetical new particles
University of Nevada, Reno team findings to be published in Physical Review Letters article
EurekAlert Public Release: 30-Apr-2009
www.eurekalert.org/pub_releases/2009-04/uonr-aps043009.php

In a forthcoming Physical Review Letters article, a group of physicists at the University of Nevada, Reno is reporting a refined analysis of experiments on violation of mirror symmetry in atoms that sets new constraints on a hypothesized particle, the extra Z-boson.

Andrei Derevianko, an associate professor in the College of Science's Department of Physics, who has conducted groundbreaking research to improve the time-telling capabilities of the world's most accurate atomic clocks, is one of the principals behind what is believed to be the most accurate to-date low-energy determination of the strength of the electroweak coupling between atomic electrons and quarks of the nucleus. "It is remarkable that the low-cost atomic precision experiments and theory are capable of constraining new physics at the level competitive to colliders," Derevianko said. He has been able to define new limits without needing something like a $6 billion Large Hadron Collider, an enormous particle accelerator in Europe that is not yet fully operational. "This is like David and Goliath; we are just a small group of people able to better interpret the data on violation of mirror symmetry in atoms. Our work indicates less of a possibility for extra Z-bosons, potential carriers of the fifth force of nature... it is possible the LHC will be able either to move the mass limit higher or discover these particles," he said.

Derevianko and his colleagues have determined the coupling strength by combining previous measurements made by Dr. Carl Wieman, a Nobel laureate in physics, with high-precision calculations in a cesium atom.
The original work by Wieman on violation of mirror symmetry in atoms used a table-top apparatus at the University of Colorado in Boulder, Colo. The Boulder team monitored a "twinge" of weak force in atoms, which are otherwise governed by the electromagnetic force. The Standard Model of elementary particles, developed in the early 1970s, holds that heavy particles, called Z-bosons, carry this weak force. In contrast to the electromagnetic force, the weak force violates mirror symmetry: an atom and its mirror image behave differently. This is known to physicists as "parity violation."

The Boulder group's experiment opened the door to new inquiry, according to Derevianko. "It pointed out a discrepancy, and hinted at a possibility for new physics, in particular, extra Z-bosons," he said. Interpretation of the Boulder experiment requires theoretical input. The analysis requires detailed understanding of the correlated motion of the 55 electrons of the cesium atom. This is not an easy task, as the number of memory units required for storing full quantum-mechanical wavefunctions exceeds the estimated number of atoms in the Universe. Special computational tools and approximations were developed. Compared to previous analyses, reaching the next level of accuracy required a factor of 1,000 increase in computational complexity.

The paper represents a dramatic improvement, as researchers have struggled to develop a more precise test of the Standard Model. Derevianko's group, which included Dr. S. Porsev and a number of students, has worked on the analysis of the Boulder experiment for the past eight years. "Finally, the computer technology caught up with the number-crunching demands of the problem and we were able to attack the problem," says Derevianko. "I have greatly benefited from collaborations in this complex problem. A fellow co-author, Kyle Beloy, for example, has recently been recognized as an Outstanding Graduate Researcher by the University."
In contrast to previous, less accurate interpretations of the Boulder experiment, Derevianko's group has found perfect agreement with the prediction of the Standard Model. This agreement holds important implications for particle physics. "Atomic parity violation places powerful constraints on new physics beyond the Standard Model of elementary particles," Derevianko said. "With this new-found precision, we are doing a better job of 'listening' to the atoms."

By refining and improving the computations, Derevianko said there is potential for a better understanding of hypothetical particles (extra Z-bosons) which could be carriers of a so-far elusive fifth force of nature. For years, physics researchers have grappled with experiments to prove or disprove the possibility of a fifth force of Nature. There are four known fundamental forces of Nature. In addition to gravity, electromagnetism creates light, radio waves and other forms of radiation. Two other forces operate only on an atomic level: these are the strong force, which binds particles in the nucleus, and the weak force, which reveals itself when atoms break down in radioactive decay or, as in the Boulder experiment, through parity violation. The possibility of a fifth force could dispute the long-held belief that the force of gravity is the same for all substances. "New physics beyond the Standard Model is the next frontier," Derevianko said, "and it's the theoretical motivation for much of this research."

To read Derevianko's paper co-authored with S. Porsev and K. Beloy, go to: arxiv.org/abs/0902.0335

Below is a summary of Derevianko's paper, which is entitled "Precision determination of electroweak coupling from atomic parity violation and implications for particle physics":

Atomic parity violation places powerful constraints on new physics beyond the Standard Model of elementary particles.
The measurements are interpreted in terms of the nuclear weak charge, quantifying the strength of the electroweak coupling between atomic electrons and quarks of the nucleus. We report the most accurate to-date determination of this coupling strength by combining previous measurements by the Boulder group with our high-precision calculations in the cesium atom. Our result is in perfect agreement with the prediction of the Standard Model. In combination with the results of high-energy collider experiments, our work confirms the predicted energy dependence (or "running") of the electroweak interaction over an energy range spanning four orders of magnitude (from ~10 MeV to ~100 GeV) and places new limits on the masses of extra Z bosons (Z'). Our raised bound on the Z' masses carves out a lower-energy part of the discovery reach of the Large Hadron Collider. At the same time, a major goal of the LHC is to find evidence for supersymmetry (SUSY), one of the basic, yet experimentally unproven, concepts of particle physics. Our result is consistent with R-parity conserving SUSY with relatively light (sub-TeV) superpartners. This raises additional hopes of discovering SUSY at the LHC.
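For readers curious where the "nuclear weak charge" number comes from, the tree-level Standard Model expression is simple enough to evaluate by hand. The value of the mixing angle below is an assumption (the Z-pole value), and the radiative corrections and low-energy "running" discussed above are deliberately left out, so this is only a ballpark consistency check, not the paper's precision result:

```python
# Tree-level Standard-Model weak charge probed by atomic parity violation:
#   Q_W ~ -N + Z * (1 - 4 * sin^2(theta_W))
# Radiative corrections (which shift this at the ~1% level) are not included.

Z, N = 55, 78            # protons and neutrons in cesium-133
sin2_theta_w = 0.2312    # weak mixing angle (assumed Z-pole value)

q_weak = -N + Z * (1 - 4 * sin2_theta_w)
print(f"Q_W(Cs), tree level: {q_weak:.2f}")   # ~ -73.9
```

Because 4 sin^2(theta_W) is close to 1, the proton contribution nearly cancels and the weak charge is dominated by the neutron number — which is why the measurement is so sensitive to the precise value of the electroweak coupling.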
NEWS
Post by towhom on May 1, 2009 5:51:05 GMT 4
Sandia researchers construct carbon nanotube device that can detect colors of the rainbow
Sandia National Laboratories
FOR IMMEDIATE RELEASE: April 30, 2009
www.sandia.gov/news/resources/releases/2009/nano_detect.html

LIVERMORE, CA — Researchers at Sandia National Laboratories have created the first carbon nanotube device that can detect the entire visible spectrum of light, a feat that could soon allow scientists to probe single molecule transformations, study how those molecules respond to light, observe how the molecules change shapes, and understand other fundamental interactions between molecules and nanotubes.

Carbon nanotubes are long thin cylinders composed entirely of carbon atoms. While their diameters are in the nanometer range (1-10 nm), they can be very long, up to centimeters in length. The carbon-carbon bond is very strong, making carbon nanotubes very robust and resistant to any kind of deformation.

To construct a nanoscale color detector, Sandia researchers took inspiration from the human eye, and in a sense, improved on the model. When light strikes the retina, it initiates a cascade of chemical and electrical impulses that ultimately trigger nerve impulses. In the nanoscale color detector, light strikes a chromophore and causes a conformational change in the molecule, which in turn causes a threshold shift on a transistor made from a single-walled carbon nanotube. "In our eyes the neuron is in front of the retinal molecule, so the light has to transmit through the neuron to hit the molecule," says Sandia researcher Xinjian Zhou. "We placed the nanotube transistor behind the molecule — a more efficient design."

Zhou and his Sandia colleagues François Léonard, Andy Vance, Karen Krafcik, Tom Zifer, and Bryan Wong created the device. The team recently published a paper, "Color Detection Using Chromophore-Nanotube Hybrid Devices," in the journal Nano Letters.
The idea of carbon nanotubes being light sensitive has been around for a long time, but earlier efforts using an individual nanotube were only able to detect light in narrow wavelength ranges at laser intensities. The Sandia team found that their nanodetector was orders of magnitude more sensitive, down to about 40 W/m2 — about 3 percent of the density of sunshine reaching the ground. "Because the dye is so close to the nanotube, a little change turns into a big signal on the device," says Zhou.

The research is in its second year of internal Sandia funding and is based on Léonard's collaboration with the University of Wisconsin to explain the theoretical mechanism of carbon nanotube light detection. Léonard literally wrote the book on carbon nanotubes — The Physics of Carbon Nanotubes, published September 2008. Léonard says the project draws upon Sandia's expertise in both materials physics and materials chemistry. He and Wong laid the groundwork with their theoretical research, with Wong completing the first-principles calculations that supported the hypothesis of how the chromophores were arranged on the nanotubes and how the chromophore isomerizations affected electronic properties of the devices.

To construct the device, Zhou and Krafcik first had to create a tiny transistor made from a single carbon nanotube. They deposited carbon nanotubes on a silicon wafer and then used photolithography to define electrical patterns to make contacts. The final piece came from Vance and Zifer, who synthesized molecules to create three types of chromophores that respond to either the red, green, or orange bands of the visible spectrum. Zhou immersed the wafer in the dye solution and waited a few minutes while the chromophores attached themselves to the nanotubes. The team reached their goal of detecting visible light faster than they expected — they thought the entire first year of the project would be spent testing UV light.
Now, they are looking to increase the efficiency by creating a device with multiple nanotubes. "Detection is now limited to about 3 percent of sunlight, which isn't bad compared with a commercially available digital camera," says Zhou. "I hope to add some antennas to increase light absorption." A device made with multiple carbon nanotubes would be easier to construct and the resulting larger area would be more sensitive to light. A larger size is also more practical for applications.

Now, they are setting their sights on detecting infrared light. "We think this principle can be applied to infrared light and there is a lot of interest in infrared detection," says Vance. "So we're in the process of looking for dyes that work in infrared."

This research eventually could be used for a number of exciting applications, such as an optical detector with nanometer scale resolution, ultra-tiny digital cameras, solar cells with more light absorption capability, or even genome sequencing. The near-term purpose, however, is basic science. "A large part of why we are doing this is not to invent a photo detector, but to understand the processes involved in controlling carbon nanotube devices," says Léonard.

The next step in the project is to create a nanometer-scale photovoltaic device. Such a device on a larger scale could be used as an unpowered photo detector or for solar energy. "Instead of monitoring current changes, we'd actually generate current," says Vance. "We have an idea of how to do it, but it will be a more challenging fabrication process."

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin company, for the U.S. Department of Energy's National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.
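The eye-inspired detection scheme in the release boils down to: each chromophore type absorbs one colour band and, via its conformational change, shifts the threshold voltage of its nanotube transistor; reading which device shifts most identifies the colour. A toy sketch of that readout logic (the band centres, widths, and shift magnitudes below are invented for illustration, not Sandia's measured values):

```python
# Toy model of chromophore-nanotube colour detection: one device per band,
# each with a (hypothetical) Gaussian absorption profile; incoming light
# is classified by which device shows the largest threshold-voltage shift.
import math

BANDS = {"red": 640.0, "orange": 600.0, "green": 530.0}  # assumed centre wavelengths, nm
WIDTH = 30.0  # assumed absorption bandwidth, nm

def threshold_shift(band_centre, wavelength, peak_shift_mv=50.0):
    """Threshold-voltage shift of one device, Gaussian in wavelength."""
    return peak_shift_mv * math.exp(-((wavelength - band_centre) / WIDTH) ** 2)

def detect(wavelength):
    """Name of the band whose device shifts the most for this wavelength."""
    shifts = {name: threshold_shift(c, wavelength) for name, c in BANDS.items()}
    return max(shifts, key=shifts.get)

print(detect(650))  # red
print(detect(535))  # green
```

The design choice the article highlights — placing the transistor directly behind the molecule — is what makes the per-device shift large enough for this winner-take-all readout to work at low light levels.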
NEWS
Post by satchmo on May 1, 2009 6:59:47 GMT 4
Congressman Paul on the Recent Swine Flu Scare
added May 1 2009: Oops... I see you've already posted this, Nina... thanks... I'll leave it up as it is very informative.
Looks like the media is hyping the situation... and the pharma companies' shares are going up in value as a result.
As Ron Paul said about the '70s flu scare, more people were adversely affected by the vaccination than by the flu itself (flu: 1 death; vaccinations: 20 deaths)!!!
satchmo
NEWS
Post by ninathedog on May 1, 2009 11:53:37 GMT 4
Invisibility cloak edges closer
By Victoria Gill, Science reporter, BBC News
news.bbc.co.uk/2/hi/science/nature/8025886.stm

Scientists have rendered objects invisible under near-infrared light. Unlike previous such "cloaks", the new work does not employ metals, which introduce losses of light and result in imperfect cloaking. Because the approach can be scaled down further in size, researchers say this is a major step towards a cloak that would work for visible light. One of the research teams describes its miniature "carpet cloak" in the journal Nature Materials.

This "carpet" design was based on a theory first described by John Pendry, from Imperial College London, in 2008. Michal Lipson and her team at Cornell University demonstrated a cloak based on the concept. Xiang Zhang, professor of mechanical engineering at the University of California, Berkeley, led the other team. "Essentially, we are transforming a straight line of light into a curved line around the cloak, so you don't perceive any change in its pathway," he explained.

This is not the first time an invisibility cloak has been made, but previous designs have used metals, whereas the carpet cloak is built using a dielectric — or insulating material — which absorbs far less light. "Metals introduce a lot of loss, or reduce the light intensity," said Professor Zhang. This loss can leave a darkened spot in the place of the cloaked object. So using silicon, a material that absorbs very little light, is a "big step forward," he says.

Transforming light
The cloak's design cancels out the distortion produced by the bulge of the object underneath, bending light around it — like water around a rock — and giving the illusion of a flattened surface. Professor Zhang explained that the cloak "changes the local density" of the object it is covering. "When light passes from air into water it will be bent, because the optical density, or refraction index, of the water is different to that of air," he told BBC News.
"So by manipulating the optical density of an object, you can transform the light path from a straight line to any path you want." The new material does this via a series of minuscule holes which are strategically "drilled" into a sheet of silicon. Proving Professor Pendry's theory, Professor Zhang's team was able to "decide the profile" of the cloaked object, altering the optical density with the holes. "In some areas we drill lots of very densely packed holes, and in others they are much sparser. Where the holes are more dense, there is more air than silicon, so the optical density of the object is reduced," Professor Zhang explained. "Each hole is much smaller than the wavelength of the light. So optical light doesn't see a hole - it just sees a sort of air-silicon mixture. So as far as the light is concerned, we have adjusted the density of the object."

He pointed out that his demonstration cloak is very tiny - just a few thousandths of a millimetre across. But there are applications even for a cloak of this size. Such a device could be used, for example, in the electronics industry, to hide flaws on the intricate stencils or 'masks' that are used to cast processor chips. "This could save the industry millions of dollars," he said. "It would allow them to fix flaws rather than produce an entirely new mask."
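The "air-silicon mixture" idea lends itself to a quick numerical sketch. Treating each drilled region as an effective medium and simply volume-averaging the permittivity is a crude approximation (the real cloak design uses transformation optics to choose the index profile), but it shows how hole density tunes the local optical density:

```python
# Rough effective-medium estimate: sub-wavelength holes make each region of
# the silicon sheet behave like an air-silicon mixture whose effective
# refractive index depends on the air (hole) fraction. Volume averaging of
# permittivity is a simplifying assumption, not the actual design method.
N_SI, N_AIR = 3.48, 1.0  # refractive indices near 1550 nm (silicon, air)

def effective_index(air_fraction):
    """Volume-averaged permittivity -> effective index for one region."""
    eps = air_fraction * N_AIR**2 + (1 - air_fraction) * N_SI**2
    return eps ** 0.5

for f in (0.0, 0.2, 0.5, 0.8):
    print(f"air fraction {f:.1f}: n_eff ~ {effective_index(f):.2f}")
```

Densely packed holes (high air fraction) give a low effective index and sparse holes a high one, which is exactly the knob Professor Zhang describes: light never resolves an individual hole, it only feels the locally adjusted optical density.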