
Thursday, September 6, 2012

ENCODE: a journey into the dark matter of the genome

This issue of the journal "Nature" marks a milestone in genomics with six pioneering papers from the ENCODE project. Until now, the human genome, which we can picture as a vast library written in A, T, C and G, was thought to contain barely 1.5% of intelligible text. But that is not so. What the researchers publishing in "Nature" do not hesitate to call dark matter may hold many surprises: gene-regulatory sequences, sequences that trigger diseases, sequences with functions unknown until now. The imagination can easily carry us to experiments worthy of science fiction. The twenty-first century is, without doubt, the century of genomics. Here I reproduce the main summaries published online by the authors (Nature Volume: 489, Pages: 52–55).

Published online 05 September 2012: The Encyclopedia of DNA Elements (ENCODE) project dishes up a hearty banquet of data that illuminate the roles of the functional elements of the human genome. Here, six scientists describe the project and discuss how the data are influencing research directions across many fields. See Articles p.57, p.75, p.83, p.91, p.101 & Letter p.109


Encyclopedia of DNA Elements: The Encyclopedia of DNA Elements (ENCODE) Consortium is an international collaboration of research groups funded by the National Human Genome Research Institute (NHGRI). The goal of ENCODE is to build a comprehensive parts list of functional elements in the human genome, including elements that act at the protein and RNA levels, and regulatory elements that control the cells and circumstances in which a gene is active. (nature.com/encode)

Starting with a list of simple ingredients and blending them in the precise amounts needed to prepare a gourmet meal is a challenging task. In many respects, this task is analogous to the goal of the ENCODE project1, the recent progress of which is described in this issue2, 3, 4, 5, 6, 7. The project aims to fully describe the list of common ingredients (functional elements) that make up the human genome (Fig. 1). When mixed in the right proportions, these ingredients constitute the information needed to build all the types of cells, body organs and, ultimately, an entire person from a single genome.
The ENCODE project 2, 3, 4, 5, 6, 7 provides information on the human genome far beyond that contained within the DNA sequence — it describes the functional genomic elements that orchestrate the development and function of a human. The project contains data about the degree of DNA methylation and chemical modifications to histones that can influence the rate of transcription of DNA into RNA molecules (histones are the proteins around which DNA is wound to form chromatin). ENCODE also examines long-range chromatin interactions, such as looping, that alter the relative proximities of different chromosomal regions in three dimensions and also affect transcription. Furthermore, the project describes the binding activity of transcription-factor proteins and the architecture (location and sequence) of gene-regulatory DNA elements, which include the promoter region upstream of the point at which transcription of an RNA molecule begins, and more distant (long-range) regulatory elements. Another section of the project was devoted to testing the accessibility of the genome to the DNA-cleavage protein DNase I. These accessible regions, called DNase I hypersensitive sites, are thought to indicate specific sequences at which the binding of transcription factors and transcription-machinery proteins has caused nucleosome displacement. In addition, ENCODE catalogues the sequences and quantities of RNA transcripts, from both non-coding and protein-coding regions.
The ENCODE pilot project8 focused on just 1% of the genome — a mere appetizer — and its results hinted that the list of human genes was incomplete. Although there was scepticism about the feasibility of scaling up the project to the entire genome and to many hundreds of cell types, recent advances in low-cost, rapid DNA-sequencing technology radically changed that view9. Now the ENCODE consortium presents a menu of 1,640 genome-wide data sets prepared from 147 cell types, providing a six-course serving of papers in Nature, along with many companion publications in other journals.
One of the more remarkable findings described in the consortium's 'entrée' paper (page 57)2 is that 80% of the genome contains elements linked to biochemical functions, dispatching the widely held view that the human genome is mostly 'junk DNA'. The authors report that the space between genes is filled with enhancers (regulatory DNA elements), promoters (the sites at which DNA's transcription into RNA is initiated) and numerous previously overlooked regions that encode RNA transcripts that are not translated into proteins but might have regulatory roles. Of note, these results show that many DNA variants previously correlated with certain diseases lie within or very near non-coding functional DNA elements, providing new leads for linking genetic variation and disease.
The five companion articles3, 4, 5, 6, 7 dish up diverse sets of genome-wide data regarding the mapping of transcribed regions, DNA binding of regulatory proteins (transcription factors) and the structure and modifications of chromatin (the association of DNA and proteins that makes up chromosomes), among other delicacies.
“These findings force a rethink of the definition of a gene and of the minimum unit of heredity.”
Djebali and colleagues3 (page 101) describe ultra-deep sequencing of RNAs prepared from many different cell lines and from specific compartments within the cells. They conclude that about 75% of the genome is transcribed at some point in some cells, and that genes are highly interlaced with overlapping transcripts that are synthesized from both DNA strands. These findings force a rethink of the definition of a gene and of the minimum unit of heredity.
Moving on to the second and third courses, Thurman et al.4 and Neph et al.5 (pages 75 and 83) have prepared two tasty chromatin-related treats. Both studies are based on the DNase I hypersensitivity assay, which detects genomic regions at which enzyme access to, and subsequent cleavage of, DNA is unobstructed by chromatin proteins. The authors identified cell-specific patterns of DNase I hypersensitive sites that show remarkable concordance with experimentally determined and computationally predicted binding sites of transcription factors. Moreover, they have doubled the number of known recognition sequences for DNA-binding proteins in the human genome, and have revealed a 50-base-pair 'footprint' that is present in thousands of promoters5.
The next course, provided by Gerstein and colleagues6 (page 91), examines the principles behind the wiring of transcription-factor networks. In addition to assigning relatively simple functions to genome elements (such as 'protein X binds to DNA element Y'), this study attempts to clarify the hierarchies of transcription factors and how the intertwined networks arise.
Beyond the linear organization of genes and transcripts on chromosomes lies a more complex (and still poorly understood) network of chromosome loops and twists through which promoters and more distal elements, such as enhancers, can communicate their regulatory information to each other. In the final course of the ENCODE genome feast, Sanyal and colleagues7 (page 109) map more than 1,000 of these long-range signals in each cell type. Their findings begin to overturn the long-held (and probably oversimplified) prediction that the regulation of a gene is dominated by its proximity to the closest regulatory elements.
One of the major future challenges for ENCODE (and similarly ambitious projects) will be to capture the dynamic aspects of gene regulation. Most assays provide a single snapshot of cellular regulatory events, whereas a time series capturing how such processes change is preferable. Additionally, the examination of large batches of cells — as required for the current assays — may present too simplified a view of the underlying regulatory complexity, because individual cells in a batch (despite being genetically identical) can sometimes behave in different ways. The development of new technologies aimed at the simultaneous capture of multiple data types, along with their regulatory dynamics in single cells, would help to tackle these issues.
A further challenge is identifying how the genomic ingredients are combined to assemble the gene networks and biochemical pathways that carry out complex functions, such as cell-to-cell communication, which enable organs and tissues to develop. An even greater challenge will be to use the rapidly growing body of data from genome-sequencing projects to understand the range of human phenotypes (traits), from normal developmental processes, such as ageing, to disorders such as Alzheimer's disease10.
Achieving these ambitious goals may require a parallel investment of functional studies using simpler organisms — for example, of the type that might be found scampering around the floor, snatching up crumbs in the chefs' kitchen. All in all, however, the ENCODE project has served up an all-you-can-eat feast of genomic data that we will be digesting for some time. Bon appétit!
Once the human genome had been sequenced, it became apparent that an encyclopaedic knowledge of chromatin organization would be needed if we were to understand how gene expression is regulated. The ENCODE project goes a long way to achieving this goal and highlights the pivotal role of transcription factors in sculpting the chromatin landscape.
Although some of the analyses largely confirm conclusions from previous smaller-scale studies, this treasure trove of genome-wide data provides fresh insight into regulatory pathways and identifies prodigious numbers of regulatory elements. This is particularly so for Thurman and colleagues' data4 regarding DNase I hypersensitive sites (DHSs) and for Gerstein and colleagues' results6 concerning DNA binding of transcription factors. DHSs are genomic regions that are accessible to enzymatic cleavage as a result of the displacement of nucleosomes (the basic units of chromatin) by DNA-binding proteins (Fig. 1). They are the hallmark of cell-type-specific enhancers, which are often located far away from promoters.

The ENCODE papers expose the profusion of DHSs — more than 200,000 per cell type, far outstripping the number of promoters — and their variability between cell types. Through the simultaneous presence in the same cell type of a DHS and a nearby active promoter, the researchers paired half a million enhancers with their probable target genes. But this leaves more than 2 million putative enhancers without known targets, revealing the enormous expanse of the regulatory genome landscape that is yet to be explored. Chromosome-conformation-capture methods that detect long-range physical associations between distant DNA regions are attempting to bridge this gap. Indeed, Sanyal and colleagues7 applied these techniques to survey such associations across 1% of the genome.
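The pairing logic described above can be illustrated with a toy sketch (all coordinates and cell-type names are hypothetical; the actual ENCODE analysis correlates DHS signal with promoter activity across many cell types rather than using a simple distance cutoff):

```python
# Toy sketch: pair a putative enhancer (DHS) with a nearby active promoter
# observed in the SAME cell type. Hypothetical data; real ENCODE pipelines
# are far more sophisticated, but the core idea of co-occurrence is the same.

def pair_enhancers(dhs_sites, promoters, max_distance=500_000):
    """Return (cell_type, dhs_position, gene) triples for every DHS that
    lies within max_distance bp of an active promoter in the same cell type."""
    pairs = []
    for cell_type, dhs_pos in dhs_sites:
        for p_cell_type, gene, p_pos in promoters:
            if cell_type == p_cell_type and abs(dhs_pos - p_pos) <= max_distance:
                pairs.append((cell_type, dhs_pos, gene))
    return pairs

# Hypothetical example: one DHS sits near an active promoter in "K562";
# the second DHS is too far away, and "GENE_B" is active in a different cell type.
dhs = [("K562", 1_200_000), ("K562", 9_000_000)]
proms = [("K562", "GENE_A", 1_150_000), ("GM12878", "GENE_B", 1_150_000)]
print(pair_enhancers(dhs, proms))  # [('K562', 1200000, 'GENE_A')]
```

The cell-type match is what gives the pairing its specificity: a distal element and a promoter that are only ever open in different cell types are never linked.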
The ENCODE data start to paint a picture of the logic and architecture of transcriptional networks, in which DNA binding of a few high-affinity transcription factors displaces nucleosomes and creates a DHS, which in turn facilitates the binding of further, lower-affinity factors. The results also support the idea that transcription-factor binding can block DNA methylation (a chemical modification of DNA that affects gene expression), rather than the other way around — which is highly relevant to the interpretation of disease-associated sites of altered DNA methylation11.
The exquisite cell-type specificity of regulatory elements revealed by the ENCODE studies emphasizes the importance of having appropriate biological material on which to test hypotheses. The researchers have focused their efforts on a set of well-established cell lines, with selected assays extended to some freshly isolated cells. Challenges for the future include following the dynamic changes in the regulatory landscape during specific developmental pathways, and understanding chromatin structure in tissues containing heterogeneous cell populations.

Non-coding but functional
The vast majority of the human genome does not code for proteins and, until now, did not seem to contain defined gene-regulatory elements. Why evolution would maintain large amounts of 'useless' DNA had remained a mystery, and seemed wasteful. It turns out, however, that there are good reasons to keep this DNA. Results from the ENCODE project2, 3, 4, 5, 6, 7, 8 show that most of these stretches of DNA harbour regions that bind proteins and RNA molecules, bringing these into positions from which they cooperate with each other to regulate the function and level of expression of protein-coding genes. In addition, it seems that widespread transcription from non-coding DNA potentially acts as a reservoir for the creation of new functional molecules, such as regulatory RNAs.
What are the implications of these results for genetic studies of complex human traits and disease? Genome-wide association studies (GWAS), which link variations in DNA sequence with specific traits and diseases, have in recent years become the workhorse of the field, and have identified thousands of DNA variants associated with hundreds of complex traits (such as height) and diseases (such as diabetes). But association is not causality, and identifying those variants that are causally linked to a given disease or trait, and understanding how they exert such influence, has been difficult. Furthermore, most of these associated variants lie in non-coding regions, so their functional effects have remained undefined.
“The results imply that sequencing studies focusing on protein-coding sequences risk missing crucial parts of the genome.”
The ENCODE project provides a detailed map of additional functional non-coding units in the human genome, including some that have cell-type-specific activity. In fact, the catalogue contains many more functional non-coding regions than genes. These data show that results of GWAS are typically enriched for variants that lie within such non-coding functional units, sometimes in a cell-type-specific manner that is consistent with certain traits, suggesting that many of these regions could be causally linked to disease. Thus, the project demonstrates that non-coding regions must be considered when interpreting GWAS results, and it provides a strong motivation for reinterpreting previous GWAS findings. Furthermore, these results imply that sequencing studies focusing on protein-coding sequences (the 'exome') risk missing crucial parts of the genome and the ability to identify true causal variants.
However, although the ENCODE catalogues represent a remarkable tour de force, they contain only an initial exploration of the depths of our genome, because many more cell types must yet be investigated. Some of the remaining challenges for scientists searching for causal disease variants lie in: accessing data derived from cell types and tissues relevant to the disease under study; understanding how these functional units affect genes that may be distantly located7; and the ability to generalize such results to the entire organism.

Evolution and the code
One of the great challenges in evolutionary biology is to understand how differences in DNA sequence between species determine differences in their phenotypes. Evolutionary change may occur both through changes in protein-coding sequences and through sequence changes that alter gene regulation.

There is growing recognition of the importance of this regulatory evolution, on the basis of numerous specific examples as well as on theoretical grounds. It has been argued that potentially adaptive changes to protein-coding sequences may often be prevented by natural selection because, even if they are beneficial in one cell type or tissue, they may be detrimental elsewhere in the organism. By contrast, because gene-regulatory sequences are frequently associated with temporally and spatially specific gene-expression patterns, changes in these regions may modify the function of only certain cell types at specific times, making it more likely that they will confer an evolutionary advantage12.
However, until now there has been little information about which genomic regions have regulatory activity. The ENCODE project has provided a first draft of a 'parts list' of these regulatory elements, in a wide range of cell types, and moves us considerably closer to one of the key goals of genomics: understanding the functional roles (if any) of every position in the human genome.
Nonetheless, it will take a great deal of work to identify the critical sequence changes in the newly identified regulatory elements that drive functional differences between humans and other species. There are some precedents for identifying key regulatory differences (see, for example, ref. 13), but ENCODE's improved identification of regulatory elements should greatly accelerate progress in this area. The data may also allow researchers to begin to identify sequence alterations occurring simultaneously in multiple genomic regions, which, when added together, drive phenotypic change — a process called polygenic adaptation14.
However, despite the progress brought by the ENCODE consortium and other research groups, it remains difficult to discern with confidence which variants in putative regulatory regions will drive functional changes, and what these changes will be. We also still have an incomplete understanding of how regulatory sequences are linked to target genes. Furthermore, the ENCODE project focused mainly on the control of transcription, but many aspects of post-transcriptional regulation, which may also drive evolutionary changes, are yet to be fully explored.
Nonetheless, these are exciting times for studies of the evolution of gene regulation. With such new resources in hand, we can expect to see many more descriptions of adaptive regulatory evolution, and how this has contributed to human evolution.

From catalogue to function
Projects that produce unprecedented amounts of data, such as the human genome project15 or the ENCODE project, present new computational and data-analysis challenges and have been a major force driving the development of computational methods in genomics. The human genome project produced one bit of information per DNA base pair, and led to advances in algorithms for sequence matching and alignment. By contrast, in its 1,640 genome-wide data sets, ENCODE provides a profile of the accessibility, methylation, transcriptional status, chromatin structure and bound molecules for every base pair. Processing the project's raw data to obtain this functional information has been an immense effort.
“The high quality of the functional information produced is evident from the exquisite detail and accuracy achieved.”
For each of the molecular-profiling methods used, the ENCODE researchers devised novel processing algorithms designed to remove outliers and protocol-specific biases, and to ensure the reliability of the derived functional information. These processing pipelines and quality-control measures have been adopted by the research community as the standard for the analysis of such data. The high quality of the functional information they produce is evident from the exquisite detail and accuracy achieved, such as the ability to observe the crystallographic topography of protein–DNA interfaces in DNase I footprints5, and the observation of more than one-million-fold variation in dynamic range in the concentrations of different RNA transcripts3.
But beyond these individual methods for data processing, the profound biological insights of ENCODE undoubtedly come from computational approaches that integrated multiple data types. For example, by combining data on DNA methylation, DNA accessibility and transcription-factor expression, Thurman et al.4 provide fascinating insight into the causal role of DNA methylation in gene silencing. They find that transcription-factor binding sites are, on average, less frequently methylated in cell types that express those transcription factors, suggesting that binding-site methylation often results from a passive mechanism that methylates sites not bound by transcription factors.
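The comparison underlying that finding can be sketched with a toy computation (all methylation fractions and cell-type labels below are hypothetical, chosen only to illustrate the direction of the effect):

```python
# Toy illustration: average methylation at one factor's binding sites,
# split by whether each cell type expresses that factor.
# Hypothetical values; the ENCODE analysis aggregates genome-wide data.

def mean(xs):
    return sum(xs) / len(xs)

# Fraction of methylated CpGs at the factor's binding sites, per cell type
site_methylation = {"A": 0.10, "B": 0.15, "C": 0.70, "D": 0.80}
expresses_factor = {"A": True, "B": True, "C": False, "D": False}

expressed = [m for ct, m in site_methylation.items() if expresses_factor[ct]]
not_expressed = [m for ct, m in site_methylation.items() if not expresses_factor[ct]]

# Lower average methylation where the factor is present and bound,
# consistent with binding blocking (passive) methylation of the site.
print(round(mean(expressed), 3), round(mean(not_expressed), 3))
```

The causal reading hinges on the direction of the difference: if methylation silenced binding sites first, one would not expect the sites to be systematically unmethylated precisely in the cell types where the factor is available to bind them.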
Despite the extensive functional information provided by ENCODE, we are still far from the ultimate goal of understanding the function of the genome in every cell of every person, and across time within the same person. Even if the throughput rate of the ENCODE profiling methods increases dramatically, it is clear that brute-force measurement of this vast space is not feasible. Rather, we must move on from descriptive and correlative computational analyses, and work towards deriving quantitative models that integrate the relevant protein, RNA and chromatin components. We must then describe how these components interact with each other, how they bind the genome and how these binding events regulate transcription.

If successful, such models will be able to predict the genome's function at times and in settings that have not been directly measured. By allowing us to determine which assumptions regarding the physical interactions of the system lead to models that better explain measured patterns, the ENCODE data provide an invaluable opportunity to address this next immense computational challenge.

Wednesday, September 5, 2012

The hidden webs of drug trafficking in Mexico

America’s Secret Deal with Mexican Drug Cartels
By Tom Burghardt

Global Research, September 03, 2012

Url of this article: http://www.globalresearch.ca/managing-the-plaza-americas-secret-deal-with-mexican-drug-cartels/

The inner workings of the world of drug trafficking, a plague ravaging the entire globe, are the subject that concerns Tom Burghardt in a serious, well-documented article pointing to surprising but probably real complicities. Laundering drug money is a flourishing business that knows no borders.

In a story which should have made front page headlines, Narco News investigative journalist Bill Conroy revealed that “A high-ranking Sinaloa narco-trafficking organization member’s claim that US officials have struck a deal with the leadership of the Mexican ‘cartel’ appears to be corroborated in large part by the statements of a Mexican diplomat in email correspondence made public recently by the nonprofit media group WikiLeaks.”

A series of some five million emails, The Global Intelligence Files, was obtained by the secret-spilling organization as a result of last year’s hack by Anonymous of the Texas-based “global intelligence” firm Stratfor.

Bad tradecraft aside, the Stratfor dump offers readers insight into a shadowy world where information is sold to the highest bidder through “a global network of informants who are paid via Swiss bank accounts and pre-paid credit cards. Stratfor has a mix of covert and overt informants, which includes government employees, embassy staff and journalists around the world.”

One of those informants was a Mexican intelligence officer with the Centro de Investigación y Seguridad Nacional, or CISEN, Mexico’s equivalent of the CIA. Dubbed “MX1” by Stratfor, he operates under diplomatic cover at the Mexican consulate in Phoenix, Arizona, after a similar posting at the consulate in El Paso, Texas.

His cover was blown by the intelligence grifters when they identified him in their correspondence as Fernando de la Mora, described by Stratfor as “being molded to be the Mexican ‘tip of the spear’ in the U.S.”

In an earlier Narco News story, Conroy revealed that “US soldiers are operating inside Mexico as part of the drug war and the Mexican government provided critical intelligence to US agents in the now-discredited Fast and Furious gun-running operation,” the Mexican diplomat claimed in email correspondence.

Those emails disclosed “details of a secret meeting between US and Mexican officials held in 2010 at Fort Bliss, a US Army installation located near El Paso, Texas. The meeting was part of an effort to create better communications between US undercover operatives in Mexico and the Mexican federal police, the Mexican diplomat reveals.”

“However,” Conroy wrote, “the diplomat expresses concern that the Fort Bliss meeting was infiltrated by the ‘cartels,’ whom he contends have ‘penetrated both US and Mexican law enforcement’.”

Such misgivings are thoroughly justified given the fact, as Antifascist Calling reported last spring, that the Mexican government had arrested three high-ranking Army generals over their links to narcotrafficking organizations.

In Conroy’s latest piece the journalist disclosed that the “Mexican diplomat’s assessment of the US and Mexican strategy in the war on drugs, as revealed by the email trail, paints a picture of a ‘simulated war’ in which the Mexican and US governments are willing to show favor to a dominant narco-trafficking organization in order to minimize the violence and business disruption in the major drug plazas, or markets.”

A “simulated war”? Where have we heard that before? Like the bogus “War on Terror,” which arms and unleashes throat-slitting terrorists from the CIA’s favorite all-purpose zombie army of “Islamist extremists,” Al Qaeda, America’s fraudulent “War on Drugs” has been a splendid means of managing the global drug trade in the interest of securing geopolitical advantage over rivals.

While major financial powerhouses in Europe and the U.S. (can you say Bank of America, Barclays, Citigroup, Credit Suisse, HSBC, ING and Wachovia?) have been accused of reaping the lion’s share of profits derived from the grim trade, now a veritable Narco-Industrial Complex, the public continues to be regaled with tales that this ersatz war is being “won.”

While the Mexican body count continues to rise (nearly 120,000 dead since 2006 according to the latest estimates published by the Instituto Nacional de Estadística y Geografía, or INEGI, as reported by the Paris daily Le Monde in a recent editorial) the United States is escalating its not-so-covert military involvement in Mexico and putting proverbial boots on the ground as part of the $1.6 billion U.S.-financed Mérida Initiative.

But have such “initiatives” (in actuality, taxpayer-funded boondoggles for giant military contractors) turned the corner in the drug war? Not if estimates published by the United Nations are accurate.

According to the 2011 World Drug Report, published by the United Nations Office on Drugs and Crime (UNODC): “US authorities have estimated for the last couple of years that some 90% of the cocaine consumed in North America comes from Colombia, supplemented by some cocaine from Peru and limited amounts from the Plurinational State of Bolivia. For the year 2009, results of the US Cocaine Signature Program, based on an analysis of approximately 3,000 cocaine HCl samples, revealed that 95.5% originated in Colombia (down from 99% in 2002) and 1.7% in Peru; for the rest (2.8%), the origin could not be determined. The trafficking of cocaine into the United States is nowadays largely controlled by various Mexican drug cartels, while until the mid-1990s, large Colombian cartels dominated these operations.”

Despite more than $8 billion lavished on programs such as Plan Colombia, and despite evidence that leading Colombian politicians, including former President Álvaro Uribe and his entourage had documented links to major drug trafficking organizations that go back decades, the myth persists that pouring money into the drug war sinkhole will somehow turn the tide.

But drug seizures by U.S. agencies only partially tell the tale.

As UNODC Executive Director Yury Fedotov pointed out in the introduction to the agency’s 2011 report, Estimating Illicit Financial Flows Resulting from Drug Trafficking and Other Transnational Crimes, “all criminal proceeds are likely to have amounted to some 3.6 per cent of GDP (2.3-5.5 per cent) or around US$2.1 trillion in 2009.”

UNODC analysts disclosed that illicit money flows related to transnational organized crime “represent the equivalent of some 1.5 percent of global GDP, 70 percent of which would have been available for laundering through the financial system. The largest income for transnational organized crime seems to come from illicit drugs, accounting for a fifth of all crime proceeds.”

“If only flows related to drug trafficking and other transnational organized crime activities were considered,” UNODC asserted, “related proceeds would have been equivalent to around US$650 billion per year in the first decade of the new millennium, equivalent to 1.5% of global GDP or US$870 billion in 2009 assuming that the proportions remained unchanged. The funds available for laundering through the financial system would have been equivalent to some 1% of global GDP or US$580 billion in 2009.”
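The UNODC figures quoted above are internally consistent, as a quick back-of-the-envelope check shows (the implied 2009 global GDP of roughly US$58 trillion is derived here, not stated in the report excerpt):

```python
# Sanity check of the UNODC figures quoted above: US$870 billion is said
# to equal 1.5% of 2009 global GDP, and the laundering-available share
# is said to be about 1% of GDP, or US$580 billion.

proceeds_2009 = 870e9   # drug/organized-crime proceeds, stated as 1.5% of GDP
share_of_gdp = 0.015

implied_gdp = proceeds_2009 / share_of_gdp
print(round(implied_gdp / 1e12, 1))       # implied 2009 global GDP, ~58.0 trillion US$

# 1% of that GDP should reproduce the quoted laundering-available figure
print(round(0.01 * implied_gdp / 1e9))    # ~580 billion US$
```

The same GDP figure also squares with the earlier quote: 3.6 percent of US$58 trillion is about US$2.1 trillion, the total for all criminal proceeds.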

“The results,” according to UNODC, “also suggest that the ‘interception rate’ for anti-money-laundering efforts at the global level remains low. Globally, it appears that much less than 1% (probably around 0.2%) of the proceeds of crime laundered via the financial system are seized and frozen.”

Commenting on the nexus between global drug mafias and our capitalist overlords, former UNODC director Antonio Maria Costa told The Observer in 2009 that the proceeds of organised crime were “the only liquid investment capital” available to some banks on the brink of collapse the previous year, and that a majority of the $352bn (£216bn) of drug profits was absorbed into the economic system as a result.

Would there be an incentive then, for U.S. officials to dismantle a global business that benefits their real constituents, the blood-sucking gangsters at the apex of the capitalist financial pyramid? Hardly.

Nor would there be any incentive for American drug warriors to target organizations that inflate the balance sheets of the big banks. Wouldn’t they be more likely then, given the enormous flows of illicit cash flooding the system, to negotiate an “arrangement” with the biggest players, particularly the Sinaloa Cartel run by fugitive billionaire Joaquín “El Chapo” Guzmán?

In fact, as Narco News disclosed last December, a “quid-pro-quo arrangement is precisely what indicted narco-trafficker Jesus Vicente Zambada Niebla, who is slated to stand trial in Chicago this fall, alleges was agreed to by the US government and the leaders of the Sinaloa ‘Cartel’–the dominant narco-trafficking organization in Mexico. The US government, however, denies that any such arrangement exists.”

Narco News reported that according to “Zambada Niebla, he and the rest of the Sinaloa leadership, through the US informant Loya Castro, negotiated an immunity deal with the US government in which they were guaranteed protection from prosecution in exchange for providing US law enforcers and intelligence agencies with information that could be used to compromise rival Mexican cartels and their operations.”

In court pleadings, Zambada Niebla’s attorneys argued that “the United States government considered the arrangements with the Sinaloa Cartel an acceptable price to pay, because the principal objective was the destruction and dismantling of rival cartels by using the assistance of the Sinaloa Cartel–without regard for the fact that tons of illicit drugs continued to be smuggled into Chicago and other parts of the United States and consumption continued virtually unabated.”

Those assertions seem to be borne out by emails released by WikiLeaks. Conroy disclosed: “In a Stratfor email dated April 19, 2010, MX1 lays out the Mexican government’s negotiating, or ‘signaling,’ strategy with respect to the major narco-trafficking organizations as follows:

The Mexican strategy is not to negotiate directly.

In any event, “negotiations” would take place as follows:

Assuming a non-disputed plaza [a major drug market, such as Ciudad Juarez]:

• [If] they [a big narco-trafficking group] bring [in] some drugs, transport some drugs, [and] they are discrete, they don’t bother anyone, [then] no one gets hurt;

• [And the] government turns the other way.

Tuesday, September 4, 2012

The secret dealings of 9/11

9/11 Attacks: Criminal Foreknowledge and Insider Trading lead directly to the CIA's Highest Ranks
CIA Executive Director "Buzzy" Krongard managed Firm that handled "Put" Options on UAL
By Michael C. Ruppert

URL of this article: www.globalresearch.ca/index.php?context=va&aid=32323

Global Research Editor's note
As September approaches, we are reminded that the anniversary of the tragic events of 9/11 will soon be upon us once again. 11 years later, are we any closer to the truth about what really happened on that fateful day? For the next month until September 11, 2012, we will be posting on a daily basis important articles from our early archives pertaining to the tragic events of 9/11. The following text by Michael C. Ruppert published in October 2001 brings to the forefront the issue of foreknowledge and insider trading pertaining to airline listings on the Chicago Board Options Exchange including United Airlines and American Airlines.
Michel Chossudovsky, Global Research, August 13, 2012

Eleven years have passed since the 9/11 attacks in the United States, but in the opinion of some researchers, many gray areas remain concerning what really happened that day. The connections to the major financial centers are the subject analyzed by Michael C. Ruppert in the following piece, first published in 2001.

Suppressed Details of 9/11 Criminal Insider Trading lead directly into the CIA's Highest Ranks

CIA Executive Director "Buzzy" Krongard managed Firm that handled "Put" Options on UAL

by Michael C. Ruppert

FTW Publications, 9 October 2001, Centre for Research on Globalisation, globalresearch.ca, 20 October 2001
Although uniformly ignored by the mainstream U.S. media, there is abundant and clear evidence that a number of transactions in financial markets indicated specific (criminal) foreknowledge of the September 11 attacks on the World Trade Center and the Pentagon. That evidence also demonstrates that, in the case of at least one of these trades -- which has left a $2.5 million prize unclaimed -- the firm used to place the "put options" on United Airlines stock was, until 1998, managed by the man who now holds the number three position, Executive Director, at the Central Intelligence Agency.

Until 1997, A.B. "Buzzy" Krongard had been Chairman of the investment bank A.B. Brown. A.B. Brown was acquired by Banker's Trust in 1997. As part of the merger, Krongard became Vice Chairman of Banker's Trust-A.B. Brown, one of 20 major U.S. banks named by Senator Carl Levin this year as being connected to money laundering. Krongard's last position at Banker's Trust (BT) was overseeing "private client relations." In this capacity he had direct, hands-on dealings with some of the wealthiest people in the world, in a kind of specialized banking operation that the U.S. Senate and other investigators have identified as closely connected to the laundering of drug money.
Krongard (re?) joined the CIA in 1998 as counsel to CIA Director George Tenet. He was promoted to CIA Executive Director by President Bush in March of this year. BT was acquired by Deutsche Bank in 1999. The combined firm is the single largest bank in Europe. And, as we shall see, Deutsche Bank played several key roles in events connected to the September 11 attacks.

The Scope of Known Insider Trading
Before examining these relationships further, it is necessary to review the insider trading information that is being ignored by Reuters, The New York Times and other mass media. It is well documented that the CIA has long monitored such trades - in real time - as potential warnings of terrorist attacks and other economic moves contrary to U.S. interests. Previous stories in FTW have specifically highlighted the use of Promis software to monitor such trades.
Only two key financial terms are needed to understand the significance of these trades.

"Selling short" means borrowing stock and selling it at the current market price, without being required to actually deliver the stock for some time. If the stock falls precipitously after the short contract is entered, the seller can fulfill the contract by buying the stock at the lower price and completing the contract at the pre-crash price. These contracts often have a window of as long as four months.

"Put options," purchased at nominal prices of, for example, $1.00 per share, are sold in blocks of 100 shares. They give the holder the option of selling selected stocks at a future date at a price set when the contract is issued. Thus, for an investment of $10,000 it might be possible to tie up 10,000 shares of United or American Airlines at $100 per share, and the seller of the option is then obligated to buy them if the option is exercised. If the stock has fallen to $50 when the contract matures, the holder of the option can purchase the shares for $50 and immediately sell them for $100 - regardless of where the market then stands.
A "call option" is the reverse of a put option: in effect, a derivatives bet that the stock price will go up.
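The put-option arithmetic described above can be sketched as a small calculation. This is a hypothetical illustration using the article's round example numbers ($1.00 premium per share, $100 strike, a fall to $50), not actual market data; the function name and parameters are my own.

```python
def put_profit(contracts, premium_per_share, strike, price_at_exercise,
               shares_per_contract=100):
    """Net profit from put options bought at `premium_per_share` per share.

    A put pays off (strike - market price) per share when the stock falls
    below the strike; otherwise it expires worthless and only the premium
    is lost.
    """
    shares = contracts * shares_per_contract
    cost = shares * premium_per_share                      # premium paid up front
    payoff = shares * max(strike - price_at_exercise, 0)   # sell at strike, buy at market
    return payoff - cost

# The article's example: $10,000 of premium controls 10,000 shares
# (100 contracts x 100 shares) at a $100 strike. If the stock falls
# to $50, each share yields $50 on exercise.
profit = put_profit(contracts=100, premium_per_share=1.00,
                    strike=100, price_at_exercise=50)
# profit is $500,000 of payoff minus the $10,000 premium, i.e. $490,000
```

This asymmetry is why a sudden crash turns a small premium outlay into an outsized gain, which is the whole basis of the suspicion the article raises.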
A September 21 story by the Israeli Herzliyya International Policy Institute for Counterterrorism, entitled "Black Tuesday: The World's Largest Insider Trading Scam?" documented the following trades connected to the September 11 attacks:
Between September 6 and 7, the Chicago Board Options Exchange saw purchases of 4,744 put options on United Airlines, but only 396 call options... Assuming that 4,000 of the options were bought by people with advance knowledge of the imminent attacks, these "insiders" would have profited by almost $5 million.
On September 10, 4,516 put options on American Airlines were bought on the Chicago exchange, compared to only 748 calls. Again, there was no news at that point to justify this imbalance;... Again, assuming that 4,000 of these options trades represent "insiders," they would represent a gain of about $4 million. [The levels of put options purchased above were more than six times higher than normal.]
No similar trading in other airlines occurred on the Chicago exchange in the days immediately preceding Black Tuesday.
Morgan Stanley Dean Witter & Co., which occupied 22 floors of the World Trade Center, saw 2,157 of its October $45 put options bought in the three trading days before Black Tuesday; this compares to an average of 27 contracts per day before September 6. Morgan Stanley's share price fell from $48.90 to $42.50 in the aftermath of the attacks. Assuming that 2,000 of these options contracts were bought based upon knowledge of the approaching attacks, their purchasers could have profited by at least $1.2 million.
Merrill Lynch & Co., which occupied 22 floors of the World Trade Center, saw 12,215 October $45 put options bought in the four trading days before the attacks; the previous average volume in those shares had been 252 contracts per day [a 1200% increase!]. When trading resumed, Merrill's shares fell from $46.88 to $41.50; assuming that 11,000 option contracts were bought by "insiders," their profit would have been about $5.5 million.
European regulators are examining trades in Germany's Munich Re, Switzerland's Swiss Re, and AXA of France, all major reinsurers with exposure to the Black Tuesday disaster. [FTW Note: AXA also owns more than 25% of American Airlines stock making the attacks a "double whammy" for them.]
On September 29, 2001 - in a vital story that has gone unnoticed by the major media - the San Francisco Chronicle reported, "Investors have yet to collect more than $2.5 million in profits they made trading options in the stock of United Airlines before the Sept. 11, terrorist attacks, according to a source familiar with the trades and market data.
"The uncollected money raises suspicions that the investors - whose identities and nationalities have not been made public - had advance knowledge of the strikes." [FTW Note: They don't dare show up now. The suspension of trading for four days after the attacks made it impossible to cash out quickly and claim the prize before investigators started looking.]
"... October series options for UAL Corp. were purchased in highly unusual volumes three trading days before the terrorist attacks for a total outlay of $2,070; investors bought the option contracts, each representing 100 shares, for 90 cents each. [This represents 230,000 shares]. Those options are now selling at more than $12 each. There are still 2,313 so-called "put" options outstanding [valued at $2.77 million and representing 231,300 shares] according to the Options Clearinghouse Corp."
"...The source familiar with the United trades identified Deutsche Bank Alex. Brown, the American investment banking arm of German giant Deutsche Bank, as the investment bank used to purchase at least some of these options..."
As reported in other news stories, Deutsche Bank was also the hub of insider trading activity connected to Munich Re. just before the attacks.
CIA, the Banks and the Brokers
Understanding the interrelationships between the CIA and the banking and brokerage world is critical to grasping the already frightening implications of the above revelations. Let's trace the history of the CIA, Wall Street and the big banks through some of the key players in the CIA's history.

Clark Clifford - The National Security Act of 1947 was written by Clark Clifford, a Democratic Party powerhouse, former Secretary of Defense, and one-time advisor to President Harry Truman. In the 1980s, as Chairman of First American Bancshares, Clifford was instrumental in getting the corrupt CIA drug bank BCCI a license to operate on American shores. His profession: Wall Street lawyer and banker.
John Foster and Allen Dulles - These two brothers "designed" the CIA for Clifford. Both were active in intelligence operations during WW II. Allen Dulles was the U.S. Ambassador to Switzerland, where he met frequently with Nazi leaders and looked after U.S. investments in Germany. John Foster went on to become Secretary of State under Dwight Eisenhower; Allen went on to serve as CIA Director under Eisenhower and was later fired by JFK. Their professions: partners in Sullivan & Cromwell, to this day the most powerful law firm on Wall Street.