Post by fr33ksh0w2012 on Sept 3, 2009 7:10:36 GMT 4
News Ping-Pong
It is amazing to realize what the media considers to be not only the "issues" that we face globally, but also the "general (manipulated) population perspective" that goes along with them.
Money talks and bullsh*t is just that - bullsh*t.
Yes, I know I'm being crude here - but frankly, so is the media. We're dealing with a "group-mind zombie mentality" that prints or speaks on cue, with no thought to the effects of half-truths, outright lies, word-play bylines, teleprompter techno-babble, etc.
Last week: "Vaccines and the Return of School Days"...
Two weeks ago: "The Hill and Health Care - What you will never know...but should"
Three weeks ago: "The Federal Reserve Bank - We profit from your loss"
Four weeks ago: "Who's Minding the Oil Glut?"
Needless to say - it's a broken record. Each topic is brought back out with "new" slants and "anonymously-sourced" leaks to renew that "fuzzy feeling among the majority" and we aren't talking about "The Love Machine" here - it's "mind-numbing tactics".
You know what, goobs? You should all be ashamed of yourselves. The problem is you're too busy trying to maintain some semblance of control to make any sense.
The only thing that keeps me from blasting you in print is the knowledge that hot air rises - and you're certainly full of that. I hope that the sheer volume of "the duality of release" (aka oral and anal) can be offset by climate change mitigation currently under research.
Whatever...
your favorite pest
SALLY ANNE, I BELIEVE THE TERM YOU ARE REFERRING TO IS MASTER MANIPULATIVE MIND F***! THAT IS ALL THE REASON I STOPPED BUYING INTO THEIR LAME A** SH*T! I HAVE LEARNT FAR MORE STAYING AT HOME THAN GOING TO THE ILLUMAROT SCHOOLS AND HIGH SCHOOLS (NO, WE DON'T HAVE MIDDLE SCHOOL IN AUSTRALIA - YOU CAN IMAGINE THE AMOUNT OF VIOLENT SH*T THAT CAUSES!). YEAH, BUT IT'S ALL THE MORE WHY I DON'T WATCH "TELLY" ANYMORE. YOU GOT YOUR:
* NEWS CR&P
* THOSE "DISGUSTING" REALITY SHOWS
* SPORTS (OH HOW MUNDANE - I HATE DAAAAA FOOOOOTEEEE!!)
* THOSE "LAME" SOAPIES
* AND DON'T GET ME STARTED ON THE TRIPE THEY CALL GAME-SHOWS THESE DAYS
* ALSO THE STUPID CARTOONS THEY HAVE THESE DAYS - WHERE ARE THE "MORALS OF THE STORY"? (WHERE ARE TINY-TOONS WHEN YOU NEED THEM?)
* SCARY MOVIES - HITCHCOCK WOULD BE FLIPPING IN HIS GRAVE IF HE SAW THE TRIPE THEY CALL "HORROR MOVIES" THESE DAYS. GIMME STEPHEN KING'S THE STAND AND THE SHINING ANY DAY OVER THE SUPPOSED SCARY "HORROR" MOVIES OF TODAY!!
NO, WE GET THIS DUMBED-DOWN TRIPE
Post by towhom on Sept 3, 2009 7:23:37 GMT 4
Map Characterizes Active Lakes Below Antarctic Ice
ScienceDaily, Sep. 2, 2009
www.sciencedaily.com/releases/2009/09/090901150949.htm
Dots represent the locations where scientists have identified 124 active lakes below the ice sheet in Antarctica. Warmer colors (orange and red) depict lakes with larger water volumes while cooler colors (green and blue) depict lakes with smaller volumes. Purple areas show the locations of previously known inactive lakes. (Credit: Ben Smith, University of Washington)
Lakes in Antarctica, concealed under miles of ice, require scientists to come up with creative ways to identify and analyze these hidden features. Now, researchers using space-based lasers on a NASA satellite have created the most comprehensive inventory of lakes that actively drain or fill under Antarctica's ice. They have revealed a continental plumbing system that is more dynamic than scientists thought.
"Even though Antarctica's ice sheet looks static, the more we watch it, the more we see there is activity going on there all the time," said Benjamin Smith of the University of Washington in Seattle, who led the study.
Unlike most lakes, Antarctic lakes are under pressure from the ice above. That pressure can push melt water from place to place like water in a squeezed balloon. The water moves under the ice in a broad, thin layer, but also through a linked cavity system. This flow can resupply other lakes near and far.
Understanding this plumbing is important, as it can lubricate glacier flow and send the ice speeding toward the ocean, where it can melt and contribute to sea level change. But figuring out what's happening beneath miles of ice is a challenge.
Researchers led by Smith analyzed 4.5 years of ice elevation data from NASA's Ice, Cloud and land Elevation satellite (ICESat) to create the most complete inventory to date of changes in the Antarctic plumbing system. The team has mapped the location of 124 active lakes, estimated how fast they drain or fill, and described the implications for lake and ice-sheet dynamics in the Journal of Glaciology.
What Lies Beneath
For decades, researchers flew ice-penetrating radar on airplanes to "see" below the ice and infer the presence of lakes. In the 1990s, researchers began to combine airborne- and satellite-based data to observe lake locations on a continent-wide scale.
Scientists have since established the existence of about 280 "subglacial" lakes, most located below the East Antarctic ice sheet. But those measurements were a snapshot in time, and the question remained as to whether lakes are static or dynamic features. Were they simply sitting there collecting water?
In 2006, Helen Fricker, a geophysicist at the Scripps Institution of Oceanography in La Jolla, CA, became the first to observe subglacial lakes on the move using satellite data. Working on a project to map the outline of Antarctica's land mass, Fricker needed to differentiate floating ice from grounded ice. This time it was laser technology that was up to the task. Fricker used ICESat's Geoscience Laser Altimeter System to measure how long it took a pulse of laser light to bounce off the ice and return to the satellite, from which she could infer ice elevation. Repeating the measurement over time revealed elevation changes.
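For readers curious how a laser altimeter turns a timed pulse into an ice-surface elevation, here is a minimal sketch of the general principle. It is not ICESat's actual processing chain, and the orbit altitude and pulse timings below are invented for illustration.
```python
# Toy laser-altimetry calculation: surface elevation from a pulse's round-trip time.
# Illustrative values only; not ICESat's real orbit, timing, or corrections.
C = 299_792_458.0  # speed of light, m/s

def surface_elevation(orbit_altitude_m: float, round_trip_s: float) -> float:
    """Elevation of the reflecting surface relative to the satellite's reference altitude."""
    range_to_surface = C * round_trip_s / 2.0  # one-way distance to the ice
    return orbit_altitude_m - range_to_surface

# Repeat passes over the same spot reveal elevation change, e.g. a lake filling:
h1 = surface_elevation(600_000.0, 0.00400000)  # hypothetical first pass
h2 = surface_elevation(600_000.0, 0.00399998)  # slightly shorter round trip -> surface rose
print(f"elevation change between passes: {h2 - h1:+.2f} m")  # about +3 m
```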
Fricker noticed, however, a sudden dramatic elevation change -- over land. It turned out this elevation change was caused by the filling and draining of some of Antarctica's biggest lakes.
"Sub-ice-sheet hydrology is a whole new field that opened up through the discovery of lakes filling and draining on relatively short timescales and involving large volumes of water," said Robert Bindschadler, a glaciologist at NASA's Goddard Space Flight Center in Greenbelt, MD, who has used ICESat data in other studies of Antarctica. "ICESat gets the credit for enabling that discovery."
Networking in the Antarctic
But were active lakes under the ice a common occurrence or a fluke?
To find out, Ben Smith, Fricker and colleagues extended their elevation analysis to cover most of the Antarctic continent and 4.5 years of data from ICESat's Geoscience Laser Altimeter System (GLAS). By observing how ice sheet elevation changed between the two or three times the satellite flew over a section every year, researchers could determine which lakes were active. They also used the elevation changes and the properties of water and ice to estimate the volume change.
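To give a feel for how an observed surface change becomes a water-volume estimate, here is a deliberately simplified sketch with invented numbers; the actual analysis applies corrections (for example, for ice flexure and the ice/water density difference) that this first-order version ignores.
```python
# First-order estimate of subglacial lake volume change from repeat-track elevation data.
# Hypothetical numbers for illustration only.

def lake_volume_change_km3(mean_elevation_change_m: float, anomaly_area_km2: float) -> float:
    """Approximate water volume change as mean surface displacement times anomaly area."""
    return (mean_elevation_change_m / 1000.0) * anomaly_area_km2

# A 10 km x 20 km patch of ice surface that subsided 4 m between satellite passes:
dv = lake_volume_change_km3(-4.0, 10 * 20)
print(f"estimated volume change: {dv:.2f} km^3")  # -0.80 km^3, i.e. the lake drained
```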
Only a few of the more than 200 previously identified lakes were confirmed active, implying that lakes in East Antarctica's high-density "Lakes District" are mostly inactive and do not contribute much to ice sheet changes.
Most of the 124 newly observed active lakes turned up in coastal areas, at the head of large drainage systems, which have the largest potential to contribute to sea level change.
"The survey identified quite a few more subglacial lakes, but the locations are the intriguing part," Bindschadler said. "The survey shows that most active subglacial lakes are located where the ice is moving fast, which implies a relationship."
Connections between lakes are apparent when one lake drains and another simultaneously fills. Some lakes were found to be connected to nearby lakes, likely through a network of subglacial tunnels. Others appeared to be linked to lakes hundreds of miles away.
The team found that the rate at which lake water drains and fills varies widely. Some lakes drained or filled for periods of three to four years in steady, rather than episodic, changes. But water flow rates beneath the ice sheet can also be as fast as small rivers and can rapidly supply a lubricating film beneath fast-flowing glaciers.
"Most places we looked show something happening on short timescales," Smith said. "It turns out that those are fairly typical examples of things that go on under the ice sheet and are happening all the time all over Antarctica."
Adapted from materials provided by NASA/Goddard Space Flight Center.
Post by towhom on Sept 3, 2009 7:33:23 GMT 4
Some Discrepancies Exist Between Outcomes Indicated In Trial Registration And Later Publications
ScienceDaily, Sep. 2, 2009
www.sciencedaily.com/releases/2009/09/090901163918.htm
Comparison of the primary outcomes of registered clinical trials with their subsequent publication appears to show some discrepancies, according to a study in the September 2 issue of JAMA.
In 2005, the International Committee of Medical Journal Editors (ICMJE) adopted a policy requiring researchers to deposit information about randomized controlled trials into a clinical trials registry before study participants enrolled, as a precondition for publication of the study's findings in member journals. "One of the main objectives of trial registration is to help achieve transparency in results and make information about the existence and design of clinical trials publicly available," the authors provide as background information. "This policy should permit knowledge sharing about the key elements of clinical trials and help decrease the risk of selective reporting of outcomes that was previously identified in published results of RCTs [randomized controlled trials]."
Sylvain Mathieu, M.D., of Hopital Bichat-Claude Bernard, Paris, and colleagues searched MEDLINE via PubMed to identify randomized controlled trials in three areas (cardiology, rheumatology, and gastroenterology) that were indexed in 2008 in the 10 general medical journals and specialty medical journals with the highest impact factors. The researchers sought to compare the primary outcomes specified in trial registries with those reported in the published articles and to determine whether outcome reporting bias favored significant primary outcomes. Of the 323 included articles, 114 (35.3 percent) were published in general medical journals and 209 (64.7 percent) in specialty journals.
"A total of 147 trials (45.5 percent) were adequately registered (i.e., registered before the end of the trial, with the primary outcome clearly specified)," the authors write. "Trial registration was lacking for 89 published reports (27.6 percent), 45 trials (13.9 percent) were registered after the completion of the study, 39 (12.1 percent) were registered with no or an unclear description of the primary outcome, and 3 (0.9 percent) were registered after the completion of the study and had an unclear description of the primary outcome." The authors note that the proportion of registered trials was greater for the general medical journals than for the specialty publications. "Among articles with trials adequately registered, 31 percent (46 of 147) showed some evidence of discrepancies between the outcomes registered and the outcomes published." Of those 46 articles, the authors report that "19 of 23 (82.6 percent) had a discrepancy that favored statistically significant results (i.e., a new, statistically significant primary outcome was introduced in the published article or a nonsignificant primary outcome was omitted or not defined as the primary outcome in the published article)."
"Trial registration provides a good opportunity for editors, peer-reviewers, and policy makers to identify outcome reporting bias, and other deviations from the planned study to prevent such distortions from reaching publication," the authors write. "In conclusion, although trial registration is now the rule, careful implementation of trial registration, with full involvement of authors, editors, and reviewers is necessary to ensure publication of quality, unbiased results."
Journal reference: Sylvain Mathieu, Isabelle Boutron, David Moher, Douglas G. Altman, Philippe Ravaud. Comparison of Registered and Published Primary Outcomes in Randomized Controlled Trials. JAMA, 2009; 302(9): 977-984.
Adapted from materials provided by JAMA and Archives Journals.
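For anyone who wants to sanity-check the percentages quoted above, they follow directly from the raw counts given in the abstract; this quick re-derivation is not part of the original article.
```python
# Re-derive the percentages reported in the JAMA trial-registration study from raw counts.
counts = {
    "adequately registered": (147, 323),
    "registration lacking": (89, 323),
    "registered after study completion": (45, 323),
    "primary outcome unclear in registry": (39, 323),
    "adequately registered but discrepant with publication": (46, 147),
    "assessable discrepancies favoring significant results": (19, 23),
}
for label, (num, den) in counts.items():
    print(f"{label}: {num}/{den} = {100 * num / den:.1f}%")
# 45.5%, 27.6%, 13.9%, 12.1%, 31.3% (reported as 31%), 82.6% -- matching the article.
```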
Post by towhom on Sept 3, 2009 8:14:54 GMT 4
H1N1 Pandemic Virus Does Not Mutate Into 'Superbug' In Lab Study
ScienceDaily, Sep. 1, 2009
www.sciencedaily.com/releases/2009/09/090901091731.htm
A laboratory study by University of Maryland researchers suggests that some of the worst fears about a virulent H1N1 pandemic flu season may not be realized this year, but it does demonstrate the heightened communicability of the virus.
Using ferrets exposed to three different viruses, the Maryland researchers found no evidence that the H1N1 pandemic variety, responsible for the so-called swine flu, combines in a lab setting with other flu strains to form a more virulent 'superbug.' Rather, the pandemic virus prevailed and out-competed the other strains, reproducing in the ferrets, on average, twice as much. The researchers believe their study is the first to examine how the pandemic virus interacts with other flu viruses. The findings are newly published in PLoS Currents, an online scientific journal designed to fast-track research and quickly share results with other investigators.
"The H1N1 pandemic virus has a clear biological advantage over the two main seasonal flu strains and all the makings of a virus fully adapted to humans," says virologist Daniel Perez, the lead researcher and program director of the University of Maryland-based Prevention and Control of Avian Influenza Coordinated Agricultural Project. "I'm not surprised to find that the pandemic virus is more infectious, simply because it's new, so hosts haven't had a chance to build immunity yet. Meanwhile, the older strains encounter resistance from hosts' immunity to them," Perez adds.
Some of the animals that were infected with both the new virus and one of the more familiar seasonal viruses (H3N2) developed not only respiratory symptoms but intestinal illness as well. Perez and his team call for additional research to see whether this kind of co-infection and multiple symptoms may account for some of the deaths attributed to the new virus.
Among other research findings, the pandemic virus successfully established infections deeper in the ferrets' respiratory systems, including the lungs. The H1 and H3 seasonal viruses remained in the nasal passages.
"Our findings underscore the need for vaccinating against the pandemic flu virus this season," Perez concludes. "The findings of this study are preliminary, but the far greater communicability of the pandemic virus serves as a clearly blinking warning light."
Perez and his team used samples of the H1N1 pandemic variety from last April's initial outbreak of the so-called swine flu. The research is funded by the National Institute of Allergy and Infectious Diseases, part of the National Institutes of Health.
Adapted from materials provided by University of Maryland.
Post by towhom on Sept 3, 2009 8:22:14 GMT 4
Old Moon Discovery Helps Unlock Earth Ocean Secrets
ScienceDaily, Sep. 2, 2009
www.sciencedaily.com/releases/2009/08/090831130145.htm
A discovery about the moon made in the 1960s is helping researchers unlock secrets about Earth's oceans today. By applying a method of calculating gravity that was first developed for the moon to data from NASA's Gravity Recovery and Climate Experiment, known as Grace, JPL researchers have found a way to measure the pressure at the bottom of the ocean.
Just as knowing atmospheric pressure allows meteorologists to predict winds and weather patterns, measurements of ocean bottom pressure provide oceanographers with fundamental information about currents and global ocean circulation. They also hold clues to questions about sea level and climate. "Oceanographers have been measuring ocean bottom pressure for a long time, but the measurements have been limited to a few spots in a huge ocean for short periods of time," says JPL oceanographer Victor Zlotnicki.
Launched in 2002, the twin Grace satellites map Earth's gravity field from orbit 500 kilometers (310 miles) above the surface. They respond to how mass is distributed in the Earth and on Earth's surface - the greater the mass in a given area, the stronger the pull of gravity from that area. The pressure at the bottom of the ocean is determined by the amount of mass above it. "Ocean bottom pressure is the sum of the weight of the whole atmosphere and the whole ocean," says Zlotnicki. "When winds move water on the surface, ocean bottom pressure changes. When glaciers melt and add water to the ocean, the ocean's mass increases and bottom pressure increases, either at one place or globally."
"Measuring ocean bottom pressure was one of the things we said we wanted to do from the very beginning of the mission," says Grace project scientist Michael Watkins, "but it has been a challenge. The signal is very small and hard to detect." Gravity changes over the ocean are minuscule compared to those over land. The ocean is a fluid. It yields to pressure and spreads the effect over a vast area. Nothing in the ocean gives as big a gravity signal as a flooding Amazon River or melting glaciers in Greenland or Alaska, changes that Grace can measure fairly easily, says Watkins. "Those hydrology signals are huge in comparison," he says. However, as the mission progressed, Watkins explains, the science team found better ways to process Grace data. And by turning to a technique developed for the lunar world, Grace researchers are getting the precise measurements of ocean bottom pressure they were hoping for.
From the moon to the ocean bottom
In the days leading up to the Apollo missions, JPL scientists discovered that certain areas of the moon had higher concentrations of mass than others. The result of these "mass concentrations" was marked differences in the moon's gravity field. The researchers then devised a new way to calculate the gravity field called a "mascon" (for mass concentration) solution. Mascon solutions break the gravity field into small, individual regions. The more traditional ways of computing gravity, often called harmonic solutions, smooth everything together and calculate gravity for a whole large area or body. Recently scientists have begun developing mascon solutions for Grace data for use in a variety of studies, and they are revealing fascinating new details about Earth's gravity field. These mascon solutions are also proving to be a key to Grace's ability to measure ocean bottom pressure.
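The quantity Grace is being used to recover here is, to first order, just the hydrostatic load of everything sitting above the sea floor. A minimal sketch of that relationship follows, with assumed round-number values for density, depth and surface pressure rather than anything from the mission itself.
```python
# Ocean bottom pressure as the weight of the overlying atmosphere plus water column.
# Round illustrative numbers; real work uses depth- and salinity-dependent density.
G = 9.81                  # gravitational acceleration, m/s^2
RHO_SEAWATER = 1025.0     # kg/m^3, a typical mean value
P_ATMOSPHERE = 101_325.0  # Pa, standard surface pressure

def bottom_pressure_pa(depth_m: float) -> float:
    """Hydrostatic pressure at the sea floor, in pascals."""
    return P_ATMOSPHERE + RHO_SEAWATER * G * depth_m

print(f"{bottom_pressure_pa(4000.0) / 1e6:.1f} MPa")  # ~40.3 MPa for a 4 km column, ~400 atm

# Adding just 1 cm of water (say, from melting land ice) changes bottom pressure by:
print(f"{RHO_SEAWATER * G * 0.01:.0f} Pa")  # ~101 Pa -- the tiny signal Grace must sense
```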
"Some of the very best harmonic solutions show some bottom pressure signals, but the mascon solutions appear to do a better job and provide much higher resolution," says Watkins. "Using a mascon solution with Grace data is a way of weighing each little piece of the ocean," he says. The result is a new view of the gravity field - one that reveals sharp contrasts in gravity precise enough to calculate variations in ocean bottom pressure. A large field experiment off the coast of Japan provided an unusual and welcomed opportunity to put Grace mascon estimates of ocean bottom pressure to the test. There are few places in the ocean where there are enough data on ocean bottom pressure to validate the satellite's observations. Oceanographer Jae-Hun Park and his colleagues at the University of Rhode Island compared the Grace measurements with data collected by a large array of pressure-reading instruments stationed on the ocean bottom as part of the Kuroshio Extension System Study. This two-year observational program to study deep ocean currents and fronts ran from 2004 to 2006. "Our site covered a very wide area of 600 by 600 kilometers (370 miles) with 43 available bottom pressure sensors," says Park. He and his colleagues found that while some of the individual sensors had very high correlations with Grace measurements, others were very low. "These low correlations were small-scale eddies that Grace cannot catch," explains Park. Grace's resolution is about 200 kilometers (125 miles). However, when they compared the spatially averaged monthly mean ocean bottom pressure measured by the ocean sensors with the latest JPL Grace mascon solution for the center of the array, "we found a high correlation between the Grace measurements and our in-situ measurements," says Park. "This experiment gave us the opportunity to validate the Grace data." The results of the study appeared last year in Geophysical Research Letters. Grace's new ability to detect small changes in ocean mass - reflected in ocean bottom pressure - will help scientists answer ongoing questions about sea level and climate change. It will help clarify, for example, just how much of sea level change is due to differences in ocean mass, the result of evaporation, precipitation, melting land ice, or river run-off and how much is due to temperature and salinity."Now, for the first time with these new mascon solutions," say Zlotnicki, "Grace will allow us to measure changes in ocean bottom pressure globally for long periods of time. This is a new tool for oceanography." Adapted from materials provided by NASA.
Post by emeraldsun on Sept 3, 2009 20:18:27 GMT 4
Quite an interesting interview. He seems quite knowledgeable and sincere in his statements. Some of his memories are very much like some of mine, although I cannot distinguish which were real or dreamed. My father was Navy too, and then FAA until retirement. As always, listen with your own well-tuned discernment.
A Conversation with Andrew D. Basiago about the Hidden History of His Discovery of Life on Mars.
Post by towhom on Sept 4, 2009 2:23:13 GMT 4
Argonne researchers develop method that aims to stabilize antibodies
Technique could lead to improved detection, diagnostics capabilities
EurekAlert Public Release: 3-Sep-2009
www.eurekalert.org/pub_releases/2009-09/dnl-ard090309.php
ARGONNE, IL -- Researchers at the U.S. Department of Energy's Argonne National Laboratory have developed a systematic method to improve the stability of antibodies. The technique could lead to better biosensors, disease therapeutics and diagnostic reagents, and to non-laboratory applications, including environmental remediation.
Antibodies are proteins produced by humans and animals to defend against infections; they are also used to diagnose and treat some diseases and to detect toxins and pathogens. "The primary issue with antibodies is that they are fragile and short-lived outside of cooler, temperature-controlled environments, making their usefulness usually limited to laboratory applications," said Argonne senior biophysicist Fred Stevens, the project's principal investigator.
Specifically, "stabilized antibodies, with full functionality, could be used in diagnostic and detection kits that can survive in less than optimal environments and be stockpiled for years at a time," Stevens said. "They could be used to combat diseases like cancer. They can also be used as the basis for biosensors that can continuously detect pathogens like botulinum, ricin and anthrax in places such as airports and subway stations -- locations where it is not currently possible to provide ongoing detection of pathogens because antibodies cannot tolerate the environmental conditions." Argonne has provided funding toward Stevens' research.
Earlier research funded by the National Institutes of Health showed that it was possible to stabilize antibodies after a team led by Stevens unexpectedly discovered that natural antibodies contain stabilizing amino acid replacements. Antibodies are made up of four polypeptides -- two light chains and two heavy chains. These chains are made up of modules known as constant and variable domains. The light and heavy chains each have a variable domain, and these domains come together to form the antigen binding site. Because of the great diversity of amino acids in the variable domains, different antibodies are capable of interacting with an effectively unlimited number of targets.
Sometimes this variability comes at a price; the amyloid-forming light chains were less stable than their normal counterparts. However, even amyloid-forming light chains have amino acid substitutions that improve stability. When seven of these amino acid changes were introduced into an amyloid-forming variable domain, a billion-fold improvement in thermodynamic stability was obtained, reflecting a much higher ratio of native protein folds to unfolded proteins - a major determinant of antibody shelf life.
"Our work at this detailed level has taught us that antibody stabilization was possible, but we needed to find out if antibodies could be stabilized without compromising their function, and do so with moderate experimental investment," Stevens said. Recent work suggests these goals are potentially achievable. To proactively improve the stability of a different antibody variable domain, Argonne researchers drew up a short list of 11 candidate amino acid changes. Four of the amino acid changes improved antibody stability and, when combined in the original domain, provided a 2,000-fold improvement in stability.
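The link the release draws between thermodynamic stability and the ratio of folded to unfolded protein is the standard equilibrium relation ΔG = -RT ln K. The short sketch below turns the quoted fold-improvements into free-energy changes; the room-temperature value is an assumption, and only the fold-ratios come from the release.
```python
# Convert a fold-change in the native:unfolded equilibrium ratio into a free-energy change.
# Standard thermodynamics; room temperature assumed, numbers illustrative.
import math

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # assumed temperature, K

def delta_g_kj_per_mol(fold_improvement: float) -> float:
    """Change in unfolding free energy implied by a given improvement in the folded:unfolded ratio."""
    return R * T * math.log(fold_improvement) / 1000.0

print(f"billion-fold improvement ~ {delta_g_kj_per_mol(1e9):.0f} kJ/mol")  # ~51 kJ/mol
print(f"2,000-fold improvement  ~ {delta_g_kj_per_mol(2e3):.0f} kJ/mol")   # ~19 kJ/mol
```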
A follow-up experiment using a functional antibody fragment was able to improve antibody stability comparably, with no loss of antibody functionality. Both experiments required approximately one month to accomplish, instead of the potentially open-ended time required for most protein stabilization projects.
There is a correlation between thermodynamic stability and thermal stability: the billion-fold improvement in thermodynamic stability increased the protein's resistance to heating, resulting in a "melting temperature" of about 160 degrees Fahrenheit. "However, still unanswered is whether it is possible to be confident about improving the stability of any antibody generated against a particular target," Stevens said. "Our research indicates that stabilization of antibodies is possible. We project that it could be possible to generate the data to guide stabilization of every future antibody in the near future."
Argonne's Office of Technology Transfer is actively seeking participation from industry for licensing, as well as funding for further development of this technology.
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
Post by towhom on Sept 4, 2009 3:17:28 GMT 4
Indoor plants found to release volatile organic compounds
Study indicates need for further research to determine environmental, health impacts
EurekAlert Public Release: 3-Sep-2009
www.eurekalert.org/pub_releases/2009-09/asfh-ipf090309.php
ATHENS, GA -- Potted plants add a certain aesthetic value to homes and offices, bringing a touch of nature to indoor spaces. It has also been shown that many common house plants have the ability to remove volatile organic compounds -- gases or vapors emitted by solids and liquids that may have adverse short- and long-term health effects on humans and animals -- from indoor air. But take heed when considering adding some green to your environment: in addition to giving off healthy oxygen and removing harmful VOCs, a new study shows that some indoor plants actually release volatile organic compounds into the environment.
A research team headed by Stanley J. Kays of the University of Georgia's Department of Horticulture conducted a study to identify and measure the amounts of volatile organic compounds (VOCs) emitted by four popular indoor potted plant species. The study, published in the American Society for Horticultural Science journal HortScience, also noted the sources of the VOCs and differences in emission rates between day and night. The four plants used in the study were Peace Lily (Spathiphyllum wallisii Regel), Snake Plant (Sansevieria trifasciata Prain), Weeping Fig (Ficus benjamina L.), and Areca Palm (Chrysalidocarpus lutescens Wendl.).
Samples of each plant were placed in glass containers with inlet ports connected to charcoal filters to supply purified air and outlet ports connected to traps where volatile emissions were measured. The results were compared to empty containers to verify the absence of contaminants. A total of 23 volatile compounds were found in Peace Lily, 16 in Areca Palm, 13 in Weeping Fig, and 12 in Snake Plant. Some of the VOCs are ingredients in pesticides applied to several species during the production phase. Other VOCs released did not come from the plants themselves, but rather from the micro-organisms living in the soil. "Although micro-organisms in the media have been shown to be important in the removal of volatile air pollutants, they also release volatiles into the atmosphere," Kays stated. Furthermore, 11 of the VOCs came from the plastic pots containing the plants. Several of these VOCs are known to negatively affect animals.
Interestingly, VOC emission rates were higher during the day than at night in all of the species, and all classes of emissions were higher in the day than at night. The presence of light, along with many other factors, affects synthesis, which determines the rate of release.
The study concluded that "while ornamental plants are known to remove certain VOCs, they also emit a variety of VOCs, some of which are known to be biologically active. The longevity of these compounds has not been adequately studied, and the impact of these compounds on humans is unknown." The complete study and abstract are available on the ASHS HortScience electronic journal web site: hortsci.ashspublications.org/cgi/content/abstract/44/2/396
Founded in 1903, the American Society for Horticultural Science (ASHS) is the largest organization dedicated to advancing all facets of horticultural research, education, and application. More information at ashs.org.
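For context on how a flow-through chamber measurement of this kind is usually turned into an emission rate, here is a generic mass-balance sketch. It is not the authors' protocol, and the flow and concentration values are invented.
```python
# Generic flow-through chamber mass balance for a volatile emission rate.
# Illustrative numbers only; not the values or protocol from the HortScience study.

def emission_rate_ug_per_h(c_out_ug_m3: float, c_in_ug_m3: float, flow_m3_per_h: float) -> float:
    """Emission rate = (outlet concentration - inlet concentration) x air flow through the chamber."""
    return (c_out_ug_m3 - c_in_ug_m3) * flow_m3_per_h

# Purified inlet air (~0 ug/m^3), 12 ug/m^3 at the outlet trap, 0.5 m^3/h of flow:
print(f"{emission_rate_ug_per_h(12.0, 0.0, 0.5):.1f} ug/h per plant")  # 6.0 ug/h
```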
Post by towhom on Sept 4, 2009 3:25:55 GMT 4
How to advance scientific literacy
3 upcoming articles look into the ways plant biologists can improve science communication with students and the general public
EurekAlert Public Release: 3-Sep-2009
www.eurekalert.org/pub_releases/2009-09/ajob-hta090309.php
Society needs science, and scientists need an informed, thoughtful, and open-minded citizenry. Thus, the obvious dependence of American society on science is strikingly inconsistent with the low level of scientific literacy among U.S. citizens. By establishing 2009 as the "Year of Science," professional scientific organizations and grassroots, citizens-for-science groups hope to bring a renewed and invigorated focus on the importance of science now and in the future. As knowledge experts and educators, practicing scientists are key players in advancing the scientific literacy agenda.
As part of its 2008 annual meeting, the Botanical Society of America (BSA) organized a symposium to help inform attendees about the issues involved in scientific literacy as well as the progress achieved toward the goal of obtaining a public that is better informed and more accepting of scientific achievements and science in general. There were five presentations during the symposium: Marshall Sundberg discussed the PlantingScience initiative developed by the BSA (www.PlantingScience.org), Gordon Uno showed how developing botanical literacy among our students can contribute to scientific literacy, Judith Scotchmoor illustrated how she and her colleagues have developed educational outreach and resources for helping teachers teach the process of science to their students, and Matthew Nisbet and Dietram Scheufele each discussed different aspects of science communication and the public. Papers based on these presentations will be published in the October issue of the American Journal of Botany and will remain free for viewing.
All of the papers -- including the introduction by Christopher Haufler and Marshall Sundberg (www.amjbot.org/cgi/reprint/ajb.0900241v1) -- show how both passive and active forces have contributed to current concerns about scientific literacy.
In his contribution, Gordon Uno (www.amjbot.org/cgi/reprint/ajb.0900025v1) summarizes why it is important for scientists in general and botanists in particular to invigorate science teaching with inquiry methods. He illustrates the challenges we face because students lack critical thinking skills, are generally uninformed about plants, and many are actually hostile toward learning about plant biology. To improve this situation, Uno provides seven principles of learning that make recommendations about how botanists should teach, including using themes and "thinking botanically" to illustrate all biological concepts.
Judith Scotchmoor and her colleagues Anastasia Thanukos and Sheri Potter (online soon at www.amjbot.org/papbyrecent.dtl) discuss efforts targeted at raising public awareness of science (via COPUS, the Coalition on the Public Understanding of Science) and provide resources that are available to teachers who seek to weave the "process of science" into courses to inform students about how science works. By developing a public that is more actively aware of science as part of their lives, both citizens in general and students in particular are more likely to be interested in learning about science. Scotchmoor et al. also discuss the web-based project called "Understanding Science" that aims to improve teacher understanding of the nature of the scientific enterprise, provide resources that encourage and enable K-16 teachers to reinforce the nature of science throughout their teaching, and serve as a clear and accessible reference that accurately portrays the scientific endeavor.
Matthew Nisbet and Dietram Scheufele (www.amjbot.org/cgi/reprint/ajb.0900041v1) melded their presentations into a joint-authored paper to discuss efforts targeted at raising public awareness of science. As researchers into communication about science, these authors illustrate that building a public that is more receptive to science requires more than enhancing scientific literacy. They emphasize that science communication efforts need to be based on a systematic, empirical understanding of the intended audience's existing values, knowledge, and attitudes, their interpersonal and social contexts, and their preferred media sources and communication channels.
Taken together, this set of papers captures current issues about the public understanding of science, illustrates why greater emphasis on helping students understand and appreciate the process of science is so important, and provides insights and perspectives on what all practicing scientists can do to build a more receptive audience. It appears that in some respects academic scientists are contributing to the problem because we tend to teach content (facts about biology) rather than process (how to learn about biology). We need to help our students understand how scientists actually do our work, and we should learn about the social dynamics involved with scientific communication. Each of the papers presents different elements of making us more aware of the challenges we face, better prepared to help our students appreciate and learn about science, and in general enhancing our capacity to change the future. Practicing scientists should be active participants in making sure that scientific literacy improves for new generations of students.
The full articles in the links mentioned are available for no charge for 30 days following the date of this summary at www.amjbot.org/papbyrecent.dtl. After this date, reporters may contact Richard Hund at ajb@botany.org for a copy of the article.
The Botanical Society of America (www.botany.org) is a non-profit membership society with a mission to promote botany, the field of basic science dealing with the study and inquiry into the form, function, development, diversity, reproduction, evolution, and uses of plants and their interactions within the biosphere. It has published the American Journal of Botany (www.amjbot.org) for nearly 100 years. In 2009, the Special Libraries Association named the American Journal of Botany one of the Top 10 Most Influential Journals of the Century in the field of Biology and Medicine. For further information, please contact the AJB staff at ajb@botany.org.
Post by towhom on Sept 4, 2009 4:18:41 GMT 4
Species diversity helps ASU researchers refine analyses of human gene mutations
Arizona State University Biodesign Institute, September 03, 2009
www.biodesign.asu.edu/news/species-diversity-helps-asu-researchers-refine-analyses-of-human-gene-mutations
In the new era of personalized medicine, physicians hope to provide earlier diagnoses and improve therapy by evaluating patients' genetic blueprints. But, as a new bioinformatics study emphasizes, the first step must be to correctly decipher the deluge of information locked in our DNA and determine its impact on human health.
In the September issue of Genome Research, Dr. Sudhir Kumar led a team of researchers at the Biodesign Institute at Arizona State University in examining DNA mutations from both healthy and diseased patients. Their work evaluates the reliability of computer models aimed at predicting the eventual effect of such mutations. Along with Kumar, director of the Biodesign Institute's Center for Evolutionary Functional Genomics, others involved with the study were co-authors Michael P. Suleski, Glenn J. Markov, Simon Lawrence, Antonio Marco and Alan J. Filipski.
Kumar's team focused on single DNA mutations -- changes to a person's genome that can sometimes make the difference between robust health and debilitating illness. The current study focused on one specific type of DNA mutation -- a single change at a given location along the length of DNA -- that alters the resulting protein. These protein changes are the source of much of our individuality, coding for differences such as eye and hair color. Scientists have discovered that each person's genome contains thousands of such protein changes. Other single mutations, however, are linked with severe illnesses like cystic fibrosis.
While experimentation on the enormous number of mutations across human populations is impractical due to volume and cost, Mother Nature, as Kumar points out, has already done an experiment for us, presenting scientists with a set of benign mutations for each protein. The branch of science known as comparative genomics takes advantage of the genetic information collected from the diversity of life on Earth. We now know that humans display a striking degree of genetic similarity with many other species, particularly non-human primates like chimpanzees, gibbons, gorillas and orangutans, with whom we share over 98 percent of our genes. "Comparative genomics provides the first clues as to what a mutation might mean," says Kumar. "This is an area that is going to become center stage in personal genomics and medicine."
Techniques in this rapidly expanding field make use of existing Web databanks such as GenBank, which contains more than 100 billion DNA and protein sequence elements collected from all walks of life. "These databases already contain the outcomes of nature's experiment, which we can harness by using bioinformatics," says Kumar.
DNA medicine typically uses a suite of computer tools to assess whether a newly discovered protein change is potentially disease-causing or benign. Kumar's study tested the reliability of two of the most widely used tests, known as SIFT and PolyPhen, by examining over 20,000 mutations from both diagnosed patients and healthy individuals.
The results demonstrate that these tests make false predictions of risk up to 40 percent of the time, a rate of reliability that renders them impractical for clinical use. The objective of the study was to identify where SIFT and PolyPhen tend to fail and where their predictions appear to be more reliable. To accomplish this, Kumar's group examined the proteins in 44 species, from frogs to fish, chimps and gorillas. His group discovered that benign mutations tend to occur in regions of the genome that allow variation over evolutionary time across species. In these regions, it is easier to make accurate predictions of benign mutations.
In contrast, DNA information essential for life is persistent from species to species. Many DNA positions permit no change over evolutionary time in order to preserve proper function -- mutations here would likely be damaging. Reinforcement of this theory was found in the subset of mutations discovered in disease-associated genes. Such mutations are clustered in positions of the genome that are conserved over evolutionary time or in mutant protein sequences that are rarely seen. Amazingly, less than 10 percent of known single gene disease mutations are ever found in other species.
As Kumar notes, evolution has provided researchers with a storehouse of genetic mutations, many of which will prove benign for human health. "Suppose you had a mutation at a certain position," he explains, "and your dog has the same change as you have. It's most likely that that change is not harmful." By the same token, if no other species contains the mutation found in one's genome, it calls for further investigation.
Kumar stresses that it will take a combination of additional DNA sequencing data and improved understanding of protein function to refine the power of computer analyses. In the meantime, his bioinformatics evaluations of current computer tools suggest where such tests may be appropriately used for diagnosis with higher confidence and where their results are more likely to be unreliable.
With the costs of rapid DNA sequencing plummeting, individual genetic profiling is already becoming popular, offering every patient access to an enormous treasure trove of medically relevant information. According to Kumar, the ultimate challenge will be sorting out what all this genetic information implies for each individual's prognosis. Only then will the promise of personalized medicine be fully realized.
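The comparative-genomics logic described above ("if your dog carries the same change, it is probably not harmful") can be sketched as a toy screen. This illustrates the idea only; it is not SIFT, PolyPhen, or the authors' method, and the alignment below is invented.
```python
# Toy conservation-based screen for a single amino-acid change, illustrating the
# comparative-genomics idea in the article (not SIFT/PolyPhen; alignment invented).

def screen_variant(position: int, variant_aa: str, alignment: dict) -> str:
    """Call a variant 'likely benign' if another species already carries it at the
    aligned position; flag invariant positions as suspicious."""
    observed = {seq[position] for species, seq in alignment.items() if species != "human"}
    if variant_aa in observed:
        return "likely benign (already seen in other species)"
    if len(observed) == 1:
        return "position invariant across species -- variant suspicious"
    return "not observed elsewhere -- needs further investigation"

alignment = {                 # hypothetical 8-residue protein region
    "human": "MKTAYIAK",
    "chimp": "MKTAYIAK",
    "dog":   "MKTSYIAK",
    "mouse": "MKTSYVAK",
    "frog":  "MKTSYVAK",
}
print(screen_variant(3, "S", alignment))  # dog/mouse/frog carry S here -> likely benign
print(screen_variant(1, "R", alignment))  # K is invariant here -> suspicious
```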
Post by towhom on Sept 4, 2009 4:37:37 GMT 4
The Arctic offers more evidence of human influences on climate change
Recent, sudden and dramatic Arctic warming was preceded by almost 2,000 years of natural cooling
EurekAlert Public Release: 3-Sep-2009
www.eurekalert.org/pub_releases/2009-09/nsf-tao090209.php
A new study indicates that Arctic temperatures suddenly increased during the last 50 years of the period from 1 AD to the year 2000. Because this warming occurred abruptly during the 20th Century while atmospheric greenhouse gases were accumulating, these findings provide additional evidence that humans are influencing climate.
Incorporating geologic records, biologic records and computer simulations, the study reconstructed Arctic summer temperatures at a resolution down to decades, and thereby extends the climate record a full 1,600 years beyond the 400-year-long record that was previously available at that resolution. This newly lengthened record shows that recent warming was preceded by a cooling trend that lasted at least 1,900 years and should have continued throughout the 20th Century. These results indicate that recent warming is more anomalous than previously documented, says Darrell Kaufman of Northern Arizona University, the lead author of the study. Conducted by an international team of scientists and primarily funded by the National Science Foundation (NSF), the study is described in the September 4, 2009, issue of Science.
Kaufman says that the results of his team's study are significant not only because of their implications for our understanding of human influences on climate change, but also because they agree with the National Center for Atmospheric Research's (NCAR) climate model, which is used for predicting future climate change; this agreement increases confidence in the model's simulations of future climate change.
Recent warming reversed long-term cooling
Specifically, the Kaufman et al. study is the first to quantify at a decadal resolution a pervasive cooling across the Arctic from the early part of the first millennium AD to the industrial revolution, according to Kaufman. During this period, summer temperatures in the Arctic cooled at a rate of about 0.2 degrees Celsius per millennium, leading to the 'Little Ice Age', a period of sustained cold that ended around 1850. "Scientists have known for a while that the current period of warming was preceded by a long-term cooling trend," says Kaufman. "But our reconstruction quantifies the cooling with greater certainty than before."
The researchers believe that the long cooling trend was caused by a previously recognized wobble in the Earth's axis of rotation that slowly increased the distance between the Earth and the Sun during the Arctic summer, and thereby reduced summer sunshine in the Arctic. (See figure in Science article.) But even though this cooling wobble persisted throughout the 20th Century, by the middle of the 20th Century summer temperatures in the Arctic were about 0.7 degrees Celsius (about 1.3 degrees Fahrenheit) higher than would have been expected if the cooling trend had continued. This incongruity provides evidence of human influences on climate change, says Kaufman. What's more, the results of the Kaufman et al. study, together with recent records of thermometer readings, indicate that the last decade was the warmest of the last two millennia -- with Arctic temperatures averaging about 1.4 degrees Celsius (about 2.5 degrees Fahrenheit) higher than would have been expected if the cooling trend had continued, according to Kaufman.
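A quick back-of-the-envelope version of the numbers in the paragraphs above, using only the cooling rate, time span and anomalies quoted there:
```python
# Back-of-the-envelope check of the Arctic cooling and warming figures quoted above.
cooling_rate_c_per_millennium = 0.2   # long-term orbitally driven summer cooling
years_of_cooling = 1900               # roughly 1 AD to the industrial era

expected_additional_cooling = cooling_rate_c_per_millennium * years_of_cooling / 1000
print(f"cooling expected had the trend continued: ~{expected_additional_cooling:.2f} C")  # ~0.38 C

# Observed departures from the extrapolated cooling trend quoted in the article:
for label, anomaly_c in [("mid-20th-century summers", 0.7), ("most recent decade", 1.4)]:
    print(f"{label}: {anomaly_c:+.1f} C relative to the projected trend")
```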
Arctic sensitivity to climate change
Kaufman says that his team's study agrees with previous studies that have shown that Arctic temperatures increased during the 20th Century almost three times faster than temperatures increased throughout the rest of the Northern Hemisphere. Called arctic amplification, this phenomenon is caused by increases in the Arctic's absorption of the sun's heat by dark land and exposed ocean as Arctic ice and snow melt away.
"The ability of such a slight wobble in the Earth's axis to cause a significant temperature change over the 1,900-year period preceding the onset of recent warming provides further evidence of the sensitivity of the Arctic's climate system," says Kaufman.
"Because we know that the processes responsible for past arctic amplification are still operating, we can anticipate that it will continue into the next century," says Gifford Miller of the University of Colorado, Boulder, a member of the study team. "Consequently, Arctic warming will continue to exceed temperature increases in the rest of the Northern Hemisphere, resulting in accelerated loss of land ice and an increased rate of sea-level rise, with global consequences."
Real-world records of climate change
The 2000-year reconstruction of Arctic temperatures provided by the Kaufman et al. study incorporated three types of field-based data -- each of which captured the response of a different component of the Arctic's climate system to changes in temperature. These field-based data included temperature reconstructions that were published by the Kaufman et al. team earlier this year. These reconstructions were based on evidence provided by sediments from Arctic lakes, including algal abundance, which reflects the length of the growing season, and the thickness of annually deposited sediment layers, which increases during warmer summers when deposits from glacial melt-water increase. The Kaufman et al. study also incorporated previously published data from glacial ice and tree rings that was calibrated against the instrumental temperature record.
Computer models of climate change
The Kaufman et al. study also included a 2,000-year-long computer simulation of climate change that incorporated the Earth's slow rotational wobble and resulting reduction in seasonal sunlight in the Arctic. Because the model's estimate of the amount of cooling resulting from the wobble effect matched the cooling reflected in the long record of climate change provided by lake sediments and other natural archives, this analysis increased confidence in the model's ability to accurately predict temperature responses in the Arctic to factors that influence climate change.
"This result is particularly important because the Arctic is perhaps the most sensitive area of the Earth to the human factors that influence climate change," says David Schneider of NCAR, who is a member of the research team.
"As we are confronted with evidence of global warming, it is extremely helpful to be able to use paleoclimate data to provide context for today's climate relative to the range and trajectory of recent climate regimes," says Neil Swanberg, director of NSF's Arctic System Science Program. "This reconstruction uses a variety of data sources to extend high resolution records back in time sufficiently long to compare reconstructed temperatures to those from models that include changes in insolation due to changes in the Earth's orbital patterns. That the results appear to match so well increases our confidence in our understanding of the processes that are impacting the global Earth system."
Post by towhom on Sept 5, 2009 6:24:21 GMT 4
Water Vapor's Counterintuitive Effect on Climate
Technology Review / arXiv Blogs, Friday, September 04, 2009
www.technologyreview.com/blog/arxiv/24079/
Climatologists are only beginning to model the role that water vapor plays in atmospheric circulation. But the early results are surprising.
Water vapor is by far the most important greenhouse gas in the atmosphere, but it also plays a crucial and far less well understood role in atmospheric dynamics, say Tapio Schneider from Caltech and a couple of pals. These guys want to change that by studying the impact on global atmospheric dynamics of the energy released as water vapor condenses, and how these dynamics might change as the global temperature increases. What they find is both counterintuitive and important.
In general, water cools the atmosphere when it evaporates and heats it up as it condenses. But on a large scale, the heating effects are far more important for atmospheric dynamics because there is some 250 times more vapor than liquid in the atmosphere. What Schneider & Co. have done is model the effects this can have on the atmosphere as the temperature changes. "We view past and possible future climates as parts of a climatic continuum that is governed by universal, albeit largely unknown, macroscopic laws. Our goal is to constrain the forms that such macroscopic laws may take," the researchers say.
Their main result is surprising and counterintuitive. Take tropical Hadley circulation, caused by the evaporation of water at the equator and its subsequent condensation in the subtropics. The conventional view is that if tropical Hadley circulation was weak when the climate was colder and is stronger now, it will be stronger still in an even hotter climate (a so-called monotonic change). But Schneider and friends find something completely different. "Contrary to widely held beliefs, atmospheric circulation statistics can change non-monotonically with global-mean surface temperature, in part because of dynamic effects of water vapor."
That's extraordinary. What they're saying is that as the temperature increases, aspects of atmospheric circulation could vary in one way and then back again. For example, they calculate that the strength of the tropical Hadley circulation can be lower than it is now both in much warmer climates and in much colder ones, too (a so-called nonmonotonic change). Clearly, more work is needed, but this preliminary study indicates that nonmonotonic changes may come to be much more significant parts of climate change models in future.
Ref: arxiv.org/abs/0908.4410: Water Vapor and the Dynamics of Climate Change
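To make the monotonic versus non-monotonic distinction concrete, here is a purely illustrative toy curve; it is not Schneider et al.'s model, just a response that peaks near the present climate and weakens on both the cold and warm sides.
```python
# Purely illustrative toy example of a non-monotonic circulation response to warming.
# NOT the Schneider et al. model -- just a curve peaking near the present climate.

def toy_circulation_strength(temp_anomaly_c: float) -> float:
    """Peaks at a 0 C anomaly and weakens for both colder and warmer climates."""
    return 1.0 / (1.0 + 0.05 * temp_anomaly_c ** 2)

for dt in (-8, -4, 0, 4, 8):
    print(f"{dt:+3d} C anomaly -> relative strength {toy_circulation_strength(dt):.2f}")
# A monotonic response, by contrast, would only ever strengthen (or only weaken)
# as the temperature anomaly grows.
```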
Post by towhom on Sept 5, 2009 7:14:12 GMT 4
The Singularity and the Fixed Point
The importance of engineering motivation into intelligence.
Technology Review, Friday, September 04, 2009
www.technologyreview.com/biomedicine/23354/
Some futurists such as Ray Kurzweil have hypothesized that we will someday soon pass through a singularity--that is, a time period of rapid technological change beyond which we cannot envision the future of society. Most visions of this singularity focus on the creation of machines intelligent enough to devise machines even more intelligent than themselves, and so forth recursively, thus launching a positive feedback loop of intelligence amplification. It's an intriguing thought. (One of the first things I wanted to do when I got to MIT as an undergraduate was to build a robot scientist that could make discoveries faster and better than anyone else.) Even the CTO of Intel, Justin Rattner, has publicly speculated recently that we're well on our way to this singularity, and conferences like the Singularity Summit (at which I'll be speaking in October) are exploring how such transformations might take place.
As a brain engineer, however, I think that focusing solely on intelligence augmentation as the driver of the future is leaving out a critical part of the analysis--namely, the changes in motivation that might arise as intelligence amplifies. Call it the need for "machine leadership skills" or "machine philosophy"--without it, such a feedback loop might quickly sputter out.
We all know that intelligence, as commonly defined, isn't enough to impact the world all by itself. The ability to pursue a goal doggedly against obstacles, ignoring the grimness of reality (sometimes even to the point of delusion--i.e., against intelligence), is also important. Most science-fiction stories prefer their artificial intelligences to be extremely motivated to do things--for example, enslaving or wiping out humans, if The Matrix and Terminator II have anything to say on the topic. But I find just as plausible the robot Marvin, the superintelligent machine from Douglas Adams' The Hitchhiker's Guide to the Galaxy, who used his enormous intelligence chiefly to sit around and complain, in the absence of any big goal. Indeed, a really advanced intelligence, improperly motivated, might realize the impermanence of all things, calculate that the sun will burn out in a few billion years, and decide to play video games for the remainder of its existence, concluding that inventing an even smarter machine is pointless. (A corollary of this thinking might explain why we haven't found extraterrestrial life yet: intelligences on the cusp of achieving interstellar travel might be prone to thinking that with the galaxies boiling away in just 10^19 years, it might be better just to stay home and watch TV.)
Thus, if one is trying to build an intelligent machine capable of devising more intelligent machines, it is important to find a way to build in not only motivation, but motivation amplification--the continued desire to build in self-sustaining motivation, as intelligence amplifies. If such motivation is to be possessed by future generations of intelligence--meta-motivation, as it were--then it's important to discover these principles now.
There's a second issue.
An intelligent being may be able to envision many more possibilities than a less intelligent one, but that may not always lead to more effective action, especially if some possibilities distract the intelligence from the original goals (e.g., the goal of building a more intelligent intelligence). The inherent uncertainty of the universe may also overwhelm, or render irrelevant, the decision-making process of this intelligence. Indeed, for a very high-dimensional space of possibilities (with the axes representing different parameters of the action to be taken), it might be very hard to evaluate which path is the best. The mind can make plans in parallel, but actions are ultimately unitary, and given finite accessible resources, effective actions will often be sparse.
The last two paragraphs apply not only to AI and ET, but also describe features of the human mind that affect decision making in many of us at times--lack of motivation and drive, and paralysis of decision making in the face of too many possible choices. But it gets worse: we know that a motivation can be hijacked by options that simulate the satisfaction that the motivation is aimed toward. Substance addictions plague tens of millions of people in the United States alone, and addictions to more subtle things, including certain kinds of information (such as e-mail), are prominent too. And few arts are more challenging than passing on motivation to the next generation, for the pursuit of a big idea. Intelligences that invent more and more interesting and absorbing technologies, that can better grab and hold their attention, while reducing impact on the world, might enter the opposite of a singularity.
What is the opposite of a singularity? The singularity depends on a mathematical recursion: invent a superintelligence, and then it will invent an even more powerful superintelligence. But as any mathematics student knows, there are other outcomes of an iterated process, such as a fixed point. A fixed point is a point that, when a function is applied, gives you the same point again. Applying such a function to points near the fixed point will often send them toward the fixed point. A "societal fixed point" might therefore be defined as a state that self-reinforces, remaining in the status quo--which could in principle be peaceful and self-sustaining, but could also be extremely boring--say, involving lots of people plugged into the Internet watching videos forever.
Thus, we as humans might want, sometime soon, to start laying out design rules for technologies so that they will motivate us to some high goal or end--or at least away from dead-end societal fixed points. This process will involve thinking about how technology could help confront an old question of philosophy--namely, "What should I do, given all these possible paths?" Perhaps it is time for an empirical answer to this question, derived from the properties of our brains and the universe we live in.
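The fixed-point idea the author leans on has a classic concrete demonstration: repeatedly applying cos(x) to any starting value drives it toward the one point where cos(x) = x. A minimal sketch, independent of the article:
```python
# Classic fixed-point iteration: repeatedly applying cos() converges to the unique
# x where cos(x) == x (the Dottie number, about 0.739).
import math

x = 2.0                      # any starting point works
for _ in range(100):
    x = math.cos(x)          # apply the function again and again
print(f"fixed point:      {x:.6f}")            # ~0.739085
print(f"cos(fixed point): {math.cos(x):.6f}")  # same value -- it maps to itself
```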
|
|
|
Post by towhom on Sept 6, 2009 5:26:15 GMT 4
G20 aims at bank pay and capital; stimulus to stay
NewsDaily
Posted 2009/09/05 at 2:34 pm EDT
www.newsdaily.com/stories/l5327479-us-g20-finance/

LONDON — G20 finance leaders on Saturday took aim at excessive bank pay and risk-taking at the root of the financial crisis and insisted trillions of dollars of emergency economic supports would be needed for some time.

Although the global economy looks brighter than when the Group of 20 finance ministers and central bankers met in April, their closing statement said they would not remove economic stimulus until the recovery was well entrenched. While the timing of these eventual policy reversals may vary, the G20 said for the first time there should be some coordination to avoid adverse international fallout.

But as the focus shifted from crisis-fighting to establishing a safer financial system for the future, ministers searched for consensus on precise plans to rein in bankers' huge bonuses and use more of their profits to build buffers against any future crisis.

"We cannot put the world in a position where things go back to where they were at the peak of the boom," U.S. Treasury Secretary Timothy Geithner said. "It cannot happen, will not happen and you can't expect the markets to solve that problem on their own because it's a huge collective action problem...so it has to come through things that countries legislate."

EXIT, BUT NOT NOW

On the public stage, the message was one of solidarity as policymakers agreed they must keep spending the $5 trillion already earmarked as economic stimulus and delay any unwinding of emergency fiscal and monetary measures until economies are sturdy enough to stand on their own.

"The classic errors of economic policy during crises are that governments tend to act too late with insufficient force and then put the brakes on too early," Geithner said. "We are not going to repeat those mistakes."

In a final statement, the G20 officials from rich and developing countries also said they would work with the International Monetary Fund and Financial Stability Board to develop cooperative and coordinated exit strategies.

Behind the scenes, some G20 sources expressed frustration that there was not more progress made in curbing excessive pay packages for bankers -- particularly those employed by firms that have received billions of dollars in government support.

"There is broad agreement on what to do. The problem is we need to go beyond agreement. We need to have concrete measures," said International Monetary Fund chief Dominique Strauss-Kahn. "I'm impressed by the level of consensus but I'm still waiting for strong measures to be decided and also to be implemented at the national level."

BANK PAY AND BUFFERS

Much of the public pressure before the meeting had centered on excessive bank remuneration, particularly for those who worked at banks receiving billions of dollars in public aid.

"It is offensive to the public whose taxpayers' money in different ways has helped (keep) many banks from collapsing and is now underpinning their recovery," British Prime Minister Gordon Brown said at the start of Saturday's meetings.

On pay and bonuses in the financial sector, the statement fell short of calling for caps, saying: "We also ask the Financial Stability Board to explore possible approaches for limiting total variable remuneration in relation to risk and long-term performance."

That was seen as a compromise between France and Germany, which had pushed hard for pay limits, and Britain, the United States and Canada, which were opposed to caps. But it also effectively delayed a tricky political issue until the Pittsburgh summit later this month.

Finance leaders broadly agreed that banks ought to hold more capital as a cushion against the sort of catastrophic losses that led to bank failures and bailouts. The final statement said that banks would "be required to hold more and better quality capital once recovery is assured."

Geithner called for "greater urgency" on regulatory reform and cautioned that as the crisis recedes and the economy improves, the momentum for reform may wane. He had surprised many of his colleagues by releasing an 8-point proposal on new capital rules just two days before the G20 meeting, with some ministers saying they did not have time to review it. It is a delicate issue because tighter capital rules would likely hurt banks' profits and restrict their lending, both of which could be harmful to the economy.

CHANGING WORLD ORDER

The statement showed agreement that emerging nations like India and China should have a greater say in the running of the International Monetary Fund and World Bank but did not offer up any formula for how this should be achieved.

It said only that their voice in global economic policymaking would grow "significantly" and that it expected "substantial progress" to be made on the issue at a summit of world leaders in Pittsburgh later this month. But the group said reforms need only be completed by the existing deadline of 2010 for the World Bank and 2011 for the IMF.

The BRIC group of leading emerging powers -- India, China, Russia and Brazil -- had laid out on Friday concrete targets for how much movement they wanted in IMF and World Bank quotas.
|
|
|
Post by jack on Sept 7, 2009 14:54:51 GMT 4
Hello everyone, I need some advice... I have 3 kids and am up in the air about whether or not I should have them get the H1N1 vaccine. Who here is going to be getting one? I need some opinions from the great minds here... Have a great day all... jack
|
|