“Let me guess… You must be, Xerxes.”
– King Leonidas from the movie “300”
The New Manhattan Project (NMP) requires an enormous command and control apparatus. Over 1,000 airplanes need to be commanded. The ionospheric heaters need to be operated. Atmospheric conditions need to be monitored and analyzed. Computers are needed to assist throughout. Today’s military refers to such an apparatus as C4: command, control, communications, and computers.
The development of these types of technologies used in weather modification and the atmospheric sciences is well documented. It’s way too big to hide. This paper examines the historical development of these technologies, the currently known state of the art, and possible undisclosed operations. Specifically, it covers the development and current status of the C4 technology used as part of today’s New Manhattan Project. If you do not know what the New Manhattan Project is, please see the author’s previous article “Chemtrails Exposed: A History of the New Manhattan Project.”
Mapping the atmosphere
Before our atmosphere could be commanded and controlled, it was necessary to understand its composition and movements. This section recounts some of the most notable efforts in that quest for understanding.
The weather that we experience is a product of planet Earth, its atmosphere, and the Sun. Our atmosphere consists of many layers. In ascending order, these are: the troposphere (where we breathe the air and where our weather occurs), the stratosphere, the mesosphere, the thermosphere and the ionosphere (which overlap), and the magnetosphere. Phenomena in all of these regions, stretching to 32,000 miles above Earth’s crust, have a direct effect upon the weather we see every day. Not only that, but the water, ice, volcanoes, calderas, and other features of the Earth’s surface and sub-surface have direct relevance here. Because of this, in order to modify the weather, all of these regions must be understood and manipulated. As this piece unfolds, we will see that this understanding has been achieved.
Terrestrial weather data networks began in the 1840s with the development of the telegraph.
The first scientific explorations of our atmosphere were conducted using balloons, dropsondes, and sounding rockets. Weather balloons would be floated up to 100,000 ft. into the stratosphere, where attached devices would record atmospheric conditions and then fall back to Earth to be collected. Later balloons used radio transmissions (radiosondes) to send atmospheric data back to meteorologists on the ground. Dropsondes are devices dropped from aircraft at altitude. A dropsonde has a parachute that opens so that the device can return to Earth more slowly as it gathers atmospheric data. Sounding rockets became prevalent starting in the mid-1940s. Rockets can go much higher than balloons or aircraft (200,000 ft.) and can therefore gather higher-elevation atmospheric data, giving us a more complete picture of our atmosphere. That picture would soon grow exponentially.
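To give the reader a concrete sense of how a sonde’s raw pressure reading becomes an altitude, here is a minimal sketch using the standard-atmosphere formula for the lowest layer of the atmosphere (valid to roughly 11 km). The constants are the usual standard-atmosphere values; the 500 hPa example reading is invented for illustration, and real processing chains are considerably more involved.

    # Minimal sketch: convert a sonde pressure reading to an approximate
    # altitude using the standard-atmosphere formula for the troposphere.
    # The 500 hPa reading below is an invented example value.
    P0, T0 = 101325.0, 288.15                    # sea-level pressure (Pa) and temperature (K)
    LAPSE = 0.0065                               # temperature lapse rate (K/m)
    G, R, M = 9.80665, 8.314462618, 0.0289644    # gravity, gas constant, molar mass of dry air

    def pressure_to_altitude(p_pa: float) -> float:
        """Approximate altitude (m) for a tropospheric pressure reading."""
        return (T0 / LAPSE) * (1.0 - (p_pa / P0) ** (R * LAPSE / (G * M)))

    print(f"500 hPa corresponds to roughly {pressure_to_altitude(50000.0):.0f} m")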
Later sounding rocket experiments were designed to coordinate with ground-based electromagnetic energy generators called ionospheric heaters in order to map the auroral electrojet. The auroral electrojet is a large electric current flowing through the upper atmosphere, guided by the Earth’s natural magnetic field.
The Earth is a giant magnet. These strong magnetic fields enter and exit the Earth at the poles and surround the Earth in a toroidal fashion. Many of these later sounding rocket experiments were conducted near the Arctic Circle because at that high latitude the auroral electrojet is at lower elevations and therefore more easily observed. These sounding rockets carried payloads consisting of chemicals used to enable observations of the auroral electrojet. When a rocket neared its apex, the nosecone would explode and the chemical payload would be released. Scientists on the ground and in aircraft used photography and ground-based ionospheric heaters to make observations and thus map the auroral electrojet. If you do not know what an ionospheric heater is, please see the author’s previous article “Smoking Gun: The HAARP and Chemtrails Connection.”
As one of the leaders in the field, T. Neil Davis wrote,
Since 1955, various materials have been injected into the high atmosphere or the magnetosphere above for the purpose of creating observable markers or perturbations in the ambient medium and ongoing processes within it. Among the materials injected are sodium, trimethyl aluminate, aluminum oxide, nitric oxide, water, sulphur hexafluoride, strontium, cesium, lithium, barium and beams of energetic electrons. Injected in the proper fashion, each of these materials either emits in the visible band or interacts in some other way with the medium to produce visible emissions or visible modifications to natural emissions. Consequently, optical techniques have been extensively used to observe the effects of ionospheric releases. Other methods, including radio frequency, magnetic and in situ particle counting have been used to observe the injected materials or their effects.
Through the use of chemical releases, it has been possible to investigate a number of quantities including high-altitude winds and electric fields, the detailed configurations of the geomagnetic field within the ionosphere and the magnetosphere, as well as the propagation of energetic particle beams and their interaction with natural and ionised constituents of the high atmosphere.
Neil Davis’ quite entertaining and informative book Rockets Over Alaska: The Genesis of Poker Flat details how in the late 1960s and early 1970s the Defense Advanced Research Projects Agency (DARPA), the Defense Atomic Support Agency, and the Atomic Energy Commission jointly conducted a rocket launching program out of Poker Flat, Alaska, in the interior near Fairbanks. It was supported by the General Electric Space Sciences Laboratory. The then-Stanford Research Institute’s Radio Physics Laboratory was heavily involved; this laboratory later became the source of a lot of HAARP research. Raytheon and many, many others were there. The program was called SECEDE I & II. They had an ionospheric heater on site providing the electromagnetic energy. As of 2006 (the year of his book’s publication), Mr. Davis says that since 1969 at least 100 chemical-payload release rockets have been launched from Poker Flat. Poker Flat appears to be in use today. Similar operations by many different experimenters have been carried out all over the world. Worldwide, over the years, we’re talking about dumping, at a minimum, metric tons of all sorts of highly toxic materials into the atmosphere.
There is also, believe it or not, quite a documented history of satellites dispersing chemicals for the purpose of mapping the upper atmosphere.
After balloons and sounding rockets, satellites have traditionally been used to quantify our atmosphere. Although the vast majority of satellites have been and are used for media telecommunications, there is a grand history of satellites being employed in weather modification and the atmospheric sciences. Over the years, the technology has gotten quite good. Historically, the space has been dominated by the National Aeronautics and Space Administration (NASA).
A textbook called Satellite Technology: Principles and Applications recounts some satellite technology firsts. It reads,
The year 1959 marked a significant beginning in the field of satellite weather forecasting, when for the first time a meteorological instrument was carried on board a satellite, Vanguard-2, which was launched on 17 February 1959. The satellite was developed by National Aeronautics and Space Administration (NASA) of USA. Unfortunately the images taken by the instrument could not be used as the satellite was destroyed while on mission. The first meteorological instrument that was successfully used on board a satellite was the Suomi radiometer, which was flown on NASA’s Explorer-7 satellite, launched on 13 October 1959. All these satellites were not meteorological satellites; they just carried one meteorological instrument.
The first satellite completely dedicated to weather forecasting was also developed by NASA [in conjunction with the Army, Navy, RCA, and the Weather Bureau]. The satellite was named TIROS-1 (television and infrared observation satellite) and was launched on 1 April 1960. It carried two vidcon cameras, one having low resolution and the other with a higher resolution. Both these cameras were adaptations of standard television cameras. Though the satellite was operational for only 78 days, it demonstrated the utility of using satellites for weather forecasting applications. The first picture was transmitted by the TIROS-1 satellite on 1 April 1960.
As “Satellite Technology” continues to tell the story, “Major non-geostationary weather satellite systems that have evolved over the years include the TIROS (television and infrared observation satellite) series and the Nimbus series beginning around 1960, the ESSA (Environmental Science Service Administration) series beginning in 1966, the NOAA (National Oceanic and Atmospheric Administration) series beginning in 1970, and the DMSP (Defense Meteorological Satellite Program) series initiated in 1965 (all from the United States).” There have been many, many others. The most useful overt weather satellites today are the GOES satellites.
The following is an illustration from ICAS report 20b of July, 1977. It shows how the National effort in satellite remote atmospheric sensing evolved between 1963 and 1979. As one can see, by 1980 Earth’s atmosphere was pretty well covered. We will define and explore ‘remote sensing’ shortly.
Historical availability of satellite data. Image source: The Interdepartmental Committee for Atmospheric Sciences
In order to monitor very high altitude atmospheric conditions, NASA launched and maintained early satellites such as the Eccentric Orbiting Geophysical Observatory, the Polar Orbiting Geophysical Observatory, the Atmospheric Structures Satellite, and later the Atmospheric Explorer satellite series.
Satellites have traditionally also been used as all-purpose remote weather data collection sites and pass-throughs. All types of weather data collected from ground stations, buoys, weather balloons, aircraft, etc. are sent wirelessly to satellites, which then send the information on to ground-based data centers called “Earth stations.” We’ll have more about Earth stations shortly. Examples of this are pervasive throughout the weather modification literature. The earliest example of this as part of a weather modification effort yet found by the author is a 1974 report titled “Use of the ERTS-1 Data Collection System in Monitoring Weather Conditions for Control of Cloud Seeding Operations.”
It is interesting to note that one of the most prevalent rockets used to launch satellites and in soundings of the auroral electrojet was the V-2, a Nazi rocket originally designed to carry munitions. The V-2 was probably the first sounding rocket to help map the auroral electrojet. The most remarkable feature of the V-2 was its programmable guidance system; it was the first “smart” rocket.
Have you heard of something called Operation Paperclip? This was the classified program which, after WWII, brought Nazi scientists into the American scientific establishment. The most famous of these scientists was the man behind the V-2: Wernher von Braun. In his book Hughes After Howard: The Story of Hughes Aircraft Company, Kenneth Richardson (a former Hughes executive) writes that another top executive at Hughes, Allen Puckett, helped select Nazi scientists for Operation Paperclip. Hughes has been up to its eyeballs in this New Manhattan Project. Now one begins to see where all this is coming from.
There was something called the V-2 Panel, later called the Rocket and Satellite Research Panel. This was a group of scientists formed at Princeton University in 1946 who were interested in high-altitude rocket research. Their activities largely revolved around modifying the V-2 for atmospheric sounding use. Out of 9 seats, the panel included the famous scientist J.A. Van Allen and three board members from General Electric. The board’s chairman, E.H. Krause, left in 1947 to go work on nuclear bomb tests. The New Manhattan Project connections to both G.E. and atomic bombs are extensive.
For more about the V-2 Panel, let us refer to Erik M. Conway’s Atmospheric Science at NASA: A History:
…this rather casual entity used many of the hundred or so V-2s assembled from parts collected in Germany at the end of the war for upper atmosphere research. They were primarily interested in investigating the ionosphere and the regions above it using new instruments and techniques that they devised for themselves. The members of the group were employed by a variety of universities, including Princeton, Johns Hopkins, Iowa State, and Harvard, and by military agencies, particularly the Naval Research Laboratory (NRL). Their funding came through various mechanisms from all three armed services, which sought better understanding of the upper atmosphere’s radio characteristics to improve radar and radio performance.
By 1965, our Nation had a proficient weather data gathering network. The Committee on Government Operations published a report outlining early weather data communication networks. On page 15 of “Government Weather Programs (Military and Civilian Operations and Research)” the authors write, “Meteorological operations and services consist generally of: (1) observations of weather conditions, by both Government and non-Government observers, from land stations, ships, and aircraft, by visual and electronic tracking of balloons, by telemetry from balloons, rockets, and automatic buoys and observation posts, by radar, and more recently by satellite; (2) communication of weather observations to processing centers principally by the teletype networks of the Federal Aviation Agency, the facsimile network of the Weather Bureau, and both teletype and facsimile networks operated by the U.S. Air Force; (3) the analysis of observational data and the formation of forecasts, either at primary centers such as the National Meteorological Center and the Meteorological Satellite Data Processing and Analysis Facility of the Weather Bureau at Suitland, Md. or the Global Weather Center at Offutt Air Force Base, Nebr., or locally at regional ‘guidance’ or ‘area’ centers as necessary for specialized users…” These types of networks have only improved since then.
Over the years, many large-scale international programs have been conducted, such as the 1969 Barbados Oceanographic and Meteorological Experiment, the 1969 Tropical Meteorological Experiment, the 1974 GARP (Global Atmospheric Research Program) Atlantic Tropical Experiment, and the 1978 First GARP Global Experiment. These coordinated programs were executed in order to gain a deeper understanding of Earth’s atmospheric processes and involved all types of atmospheric monitoring activities, methods, and equipment. Participating Federal agencies included the usual suspects: the Department of Defense, the Department of Commerce, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the National Science Foundation, etc.
According to Erik Conway, “The Joint Organizing Committee of GARP voted itself out of existence in 1980. Or, rather, voted to transform itself into the Joint Scientific Committee of the World Climate Research Program. GARP’s goal had been twofold, to improve weather forecasting and to investigate the physical processes of climate; having done what seemed possible with the weather, its leaders turned to Earth’s climate. Interest in climate research had grown throughout the 1970s, in part due to NASA’s planetary studies and in part due to increasing evidence that humans had attained the ability to change climate by altering the chemistry of Earth’s atmosphere. NASA would never play as large a role in the World Climate Research Program (WCRP) as it had in GARP. But its role in climate science would far surpass its role in weather forecasting.” This high-level shift from weather forecasting to climate modeling began during the 1960s.
The 17th Interdepartmental Committee for Atmospheric Sciences report of May, 1973 detailed some of the Atomic Energy Commission’s atmospheric mapping activities. The report included a picture shown below:
Atomic Energy Commission airplane spraying a chemtrail. Image source: The Interdepartmental Committee for Atmospheric Sciences
The photo caption reads, “The smoke release in this photograph is one in a series of experiments being performed by the Meteorology Group of the Brookhaven National Laboratory for the Atomic Energy Commission in anticipation of the eventual siting of nuclear reactors at offshore locations. The dimensions and concentration of the smoke plume are being determined to assess atmospheric diffusion characteristics over the water. The onshore flow in this case, August 31, 1972, was due to a sea breeze circulation. The boat is about one mile off the coast of Long Island, New York.” The Atomic Energy Commission and the Energy Research and Development Administration conducted many atmospheric tracing experiments over the years.
The AEC has also monitored naturally occurring radioactive atmospheric particles, which can reveal a lot about atmospheric circulation patterns. Page 216 of “A Review of Federal Meteorological Programs for Fiscal Years 1965-1975” reads,
For many years the Atomic Energy Commission has operated a worldwide surface and upper air sampling program. A wide variety of radionuclides are measured along with other constituents. The data lead to improved understanding of the atmospheric general circulation and are used to build models for prediction of the local and global disposition of material released to the atmosphere.
In the nuclear area, the Atmospheric Release Advisory Capability (ARAC) system will include, in addition to the centralized base at Lawrence Livermore Laboratory, direct link-up with two major ERDA nuclear sites at Rocky Flats, Colorado and the Savannah River Laboratory, South Carolina. This is part of a long range plan to provide 24-hour atmospheric forecasting service to all major ERDA nuclear sites for use in case of nuclear incidents.
In 1977, the National Oceanic and Atmospheric Administration (NOAA) started writing about weather data collection networks in a different way. Under the heading of “Global Monitoring for Climatic Change,” NOAA writes:
The purpose of NOAA’s Global Monitoring for Climatic Change (GMCC) Program is to provide quantitative data needed for predicting climatic changes. These consist of (a) dependable measurements of existing amounts of natural and manmade trace constituents in the atmosphere, (b) determination of the rates of increase or decrease in these amounts, and (c) possible effects these changes may have on climate. The present U.S. network of baseline observatories consists of four stations located at Barrow, Alaska; Mauna Loa, Hawaii; American Samoa; and South Pole, Antarctica. These stations are designed to supply information on atmospheric trace constituent concentrations using identical instrumentation and established measurement techniques. The GMCC Program is the U.S. portion of a planned world network of stations in the EARTHWATCH program of the United Nations Environment Program.
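As a concrete illustration of item (b) in the passage above, here is a minimal sketch of how a rate of increase or decrease might be estimated from a station’s yearly measurements with a simple least-squares fit. The numbers are invented for illustration; they are not real observatory data, and the GMCC program’s actual methods are not described in detail here.

    # Minimal sketch: estimate the rate of increase of a trace constituent
    # from yearly station measurements via a least-squares linear fit.
    # The concentrations below are invented, not real observatory data.
    import numpy as np

    years = np.array([1974, 1975, 1976, 1977])
    ppm   = np.array([330.1, 331.0, 332.0, 333.7])   # hypothetical concentrations

    slope, intercept = np.polyfit(years, ppm, 1)      # slope of the linear trend
    print(f"estimated rate of increase: {slope:.2f} ppm per year")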
As evidenced by the passage above, it was around this time that the dialog changed from “weather modification” and the “atmospheric sciences” to “global warming” and “climate change.” 1977 was also the publication year of a paper titled “On Geoengineering and the CO2 Problem,” which contains the earliest use of the word “geoengineering” known to the author.
So who is in charge of mapping the atmosphere for today’s New Manhattan Project? It may be the United Nations and the World Meteorological Organization (WMO).
A 2004 document titled “The Changing Atmosphere: An Integrated Global Atmospheric Chemistry Observation Theme” depicts on its cover an unmarked, all-white jet airplane emitting visible trails, and it outlines a program for comprehensive and continuous global atmospheric observation. The authors of this report describe a system of atmospheric observation and analysis which would be sufficient for conducting today’s New Manhattan Project. Their proposed system involves ground-based radar, aircraft-based measurements, satellite-based applications, and a comprehensive data modeling system. This document was produced by the International Global Observing Strategy (IGOS), which is run by the United Nations’ WMO and the European Space Agency.
In this document, the authors describe some of the WMO’s activities in this area. They write,
The WMO GAW [Global Atmosphere Watch] programme coordinates global ground-based networks measuring greenhouse gases, ozone, UV radiation, aerosols and selected reactive gases. GAW also combines the considerable monitoring activities conducted by member countries with those of global partners such as the Network for Detection of Stratospheric Change (NDSC), NASA SHADOZ, NASA AERONET, and regional networks such as EMEP, EANET and CAPMoN in order to extend coverage offered for these classes of compounds. The activities since the start with O3 in 1957 have included the establishment of the WMO member-supported GAW measurement stations, the Central Calibration Laboratories, the Quality Assurance/Science Activity Centres, World Calibration Centres and the five World Data Centre facilities, all under the guidance of scientific advisory groups of internationally recognised experts.
Remote sensing
Not to be confused with remote viewing, remote sensing is a primary way scientists gather atmospheric data. This paper has touched on it already. Remote sensing means gathering information about an object or objects from a distance, and it started with photography. Although the term has many other applications, here it refers to the remote sensing spectrometers the New Manhattan Project needs in order to gather detailed atmospheric data from thousands of miles away. While photographs only show reflected light, spectrometers are able to determine a target’s chemical composition. This is the best way of determining the NMP’s atmospheric concentrations of aluminum and circulations of barium.
The type of remote sensor most effectively used in today’s New Manhattan Project is something called an “active remote sensing atmospheric spectrometer.” “Active” means that the device sends out a signal which is reflected off the target and then bounced back to the device for analysis. This method allows atmospheric data to be collected at night and through cloud cover.
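As a rough illustration of the “active” principle, here is a minimal sketch of the time-of-flight calculation behind an active sensor such as a radar or lidar: the round-trip delay of the returned pulse gives the range to the reflecting target. The echo delay used below is an invented example value, not a figure from any real instrument.

    # Minimal sketch: range to a target from the round-trip time of an
    # actively transmitted pulse. The 66.7-microsecond delay is invented.
    C = 299_792_458.0   # speed of light (m/s)

    def range_from_round_trip(delay_s: float) -> float:
        """Distance to the reflecting target, given the pulse's round-trip delay."""
        return C * delay_s / 2.0

    print(f"target range: {range_from_round_trip(66.7e-6) / 1000:.1f} km")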
The weather modification and atmospheric sciences literature is replete with examples of remote sensing technology deployed both from ground based systems and satellites. In the New Manhattan Project, both are utilized.
As far back as 1960, ground-based radar (a type of remote sensing) has been used in the context of weather modification programs such as the New Manhattan Project. In a photo caption from their annual weather modification report of 1960, the National Science Foundation writes,
University of Chicago research efforts to identify and isolate physical processes associated with the production of rain in summer convective clouds make use of the radar facility shown below. The AN/TPS-10 range and height finding radar is installed 11 miles northeast of West Plains, Mo. Among the studies carried on are those seeking to discover how natural processes are modified as a consequence of seeding clouds with silver iodide.
The first satellite-based atmospheric remote sensing observations were produced in the early 1960s. They came in the form of black-and-white, low-resolution television images produced by the TIROS satellites (there was a whole series of them).
Satellite remote sensing soon began to look like today’s technology. In the fourteenth ICAS report it is written:
Nimbus E and F are under development with planned launches of advanced meteorological sensors in 1972 and 1973. These satellites will make the first exploratory use of the microwave region of the spectrum for atmospheric sounding. Passive microwave and spectroscopy experiments will provide a means for determining vertical profiles of temperature and water vapor through cloud cover to the surface and also data on certain surface features. In addition advanced IR radiometers for sounding in cloudy regions for comparison with the microwave sensors will not only provide Earth radiation budget measurements but also improved spatial and spectral resolution for day/night imaging of water vapor clouds, cloud heights and surface features.
In their 1972 report to Congress and the President, under the heading of “weather modification,” the National Advisory Committee on Oceans and Atmosphere writes,
Significant progress has been made in recent years in satellite technology and in remote sensing from aircraft and from the ground. NOAA’s coming high resolution geostationary satellite and its developments in Doppler and optical radars and other remote-sensing techniques will make significant contributions to the advancement of the technology of weather modification. Satellites and remote sensing should be able to tell us something of the physical changes taking place within the seeded cloud and thus aid in the evaluation of field experiments.
In 1972 NASA reported, “Nimbus 5 was launched December 11, 1972, and has successfully met all its objectives. Included in the experiments was the first exploratory use of the microwave region of the spectrum for atmospheric sounding.”
The National Bureau of Standards (now NIST) has apparently been busily quantifying our atmosphere. Back in 1973, they were into some cutting-edge remote sensing. Here is a passage from the 17th Interdepartmental Committee for Atmospheric Sciences report:
Experiments are now being undertaken to determine the vaporization rate of a liquid into the dense gas phase and to study homogenous nucleation processes using laser light scattering techniques. Continuing research activities with application to the atmospheric sciences involve the physics and chemistry of the atomic and molecular constituents of the atmosphere, its pollutants, and their behavior and mutual interactions including thermodynamic and kinetic properties, gas phase chemical reaction rates, collisional ionization processes, infrared and microwave spectroscopy, absorption phenomena, gaseous dissociation, molecular bonding and multimer growth (such as water multimers in clouds). NBS [National Bureau of Standards] is involved in the development of tunable lasers, interferometric methods, spectroscopic tools, and analytic procedures for measuring atmospheric conditions and detecting and monitoring the concentration of pollutants, their size distribution and chemical composition. Radioactive sources and calibration standards are also employed to trace the flow of nuclear effluents in the atmosphere.
As far as a remote sensing satellite constellation (group of satellites) goes, there is a pre-existing network of satellites which would be especially suited to the New Manhattan Project. Although this network bills itself as a global media telecommunications network, it would be relatively easy to covertly install capable atmospheric spectrometers. The network is called “Iridium.” It is particularly suited to the New Manhattan Project because it produces coverage, “…to all parts of the globe, including the polar regions.” Global coverage is what this global weather modification project requires.
Let us refer once again to the textbook Satellite Technology: Principles and Applications. On page 66 it reads,
One important application of LEO [low Earth orbit] satellites for communication is the project Iridium, which is a global communication system conceived by Motorola. A total of 66 satellites are arranged in a distributed architecture, with each satellite carrying 1/66 of the total system capacity. The system is intended to provide a variety of telecommunication services at the global level. The project is named ‘Iridium’ as earlier the constellation was proposed to have 77 satellites and the atomic number of iridium is 77. Other applications where LEO satellites can be put to use are surveillance, weather forecasting, remote sensing and scientific studies.
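As a rough illustration of why a low Earth orbit constellation needs so many satellites, here is a minimal sketch of the orbital-period calculation from Kepler’s third law: at LEO altitudes each satellite circles the Earth in roughly an hour and a half, so continuous global coverage requires many of them. The 780 km altitude used below is the figure commonly reported for Iridium; it is an assumption, not a number taken from the textbook quoted above.

    # Minimal sketch: orbital period of a circular low Earth orbit.
    # The 780 km altitude is an assumed, commonly reported Iridium figure.
    import math

    MU_EARTH = 3.986004418e14      # Earth's gravitational parameter (m^3/s^2)
    R_EARTH  = 6.371e6             # mean Earth radius (m)
    altitude = 780e3               # assumed orbital altitude (m)

    a = R_EARTH + altitude                             # semi-major axis of the circular orbit
    period = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)  # Kepler's third law
    print(f"orbital period: {period / 60:.1f} minutes")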
The New Manhattan Project C4 apparatus probably employs not only satellite-based remote sensors, but ground-based atmospheric remote sensors as well. The previously noted International Global Observing Strategy (IGOS) report of 2004 makes note of an in-use ground-based network of remote sensors employing, “…passive remote sensing spectrometers operating in various wavelength regions providing total column or low resolution vertical profiles of a number of atmospheric gases and aerosols; and active remote sensing lidar instruments for high-resolution remote sensing of atmospheric components.”
A global network of capable ground based atmospheric remote sensing facilities already exists. A group called the Network for the Detection of Atmospheric Composition Change (NDACC) is comprised of, “…more than 70 high-quality, remote sensing research sites for: observing and understanding the physical/chemical state of the stratosphere and upper troposphere” as well as, “…assessing the impact of stratospheric changes on the underlying troposphere and on global climate.”
Earth stations
Satellites require ground-based locations known as “Earth stations” to send and receive data. These are the giant satellite dish farms scattered around the Country and the world. These stations also exchange data with each other. They are hubs for all sorts of satellite and terrestrial communications. Although most of their activities pertain to media telecommunications rather than atmospheric data, any number of Earth stations may be involved in the New Manhattan Project. Digital data is digital data. A few locations are more likely to be involved than others.
The Canberra Deep Space Communications Complex may be involved. According to “Satellite Technology: Principles and Applications,” it is run by NASA’s Jet Propulsion Laboratory. NASA has been deeply involved in the New Manhattan Project. Not only that, but Canberra has been managed by those great purveyors of the New Manhattan Project: Raytheon. Further, Canberra apparently had a big upgrade in the mid-1990s, which was when the chemtrails started. Further still, they have placed their antennas underground. Other evidence from Project Sanguine suggests that the New Manhattan Project employs sub-surface antennas.
The Kaena Point Satellite Tracking Station may also be involved. This facility is run by our U.S. military. Evidence suggests that the New Manhattan Project heavily involves certain questionable portions of the American military. The authors of “Satellite Technology” describe Kaena thusly,
The Kaena Point Satellite Tracking Station is a military installation of the United States Air Force (USAF) located at Kaena point on the island of Oahu in Hawaii. The station was originally established in the year 1959 to support the highly classified Corona satellite program. It is a part of Air Force satellite control network responsible for tracking satellites in orbit, many of which belong to the United States Department of Defense.
Lastly, there is an area of the United States incorporating portions of Virginia, West Virginia, and Maryland called the United States National Radio Quiet Zone. This 13,000 square mile area has been officially designated an area where radio transmissions are highly restricted and mostly relegated to two government radio transmission facilities located within the exclusion zone: the National Radio Astronomy Observatory and the Navy’s Sugar Grove station.
Among other restricted devices, it’s a cell-phone-free zone. This is done to ensure the quality of radio transmissions to and from the two government communications facilities; the multitude of electromagnetic signals commonly found outside this exclusion zone would otherwise interfere with their communications. In short, the exclusion zone ensures that these two large facilities enjoy secure, high-quality communications.
As far as the New Manhattan Project is concerned, it may be advantageous to utilize either or both of the two communications facilities within the United States National Radio Quiet Zone. The incredibly technical and secret New Manhattan Project requires communications to be both high-quality and secure. The Navy, which runs one of the installations in the exclusion zone, has one of the most extensive histories of involvement in the New Manhattan Project.
Commercial passenger airline atmospheric data reconnaissance
As part of the previously detailed atmospheric observational networks, the New Manhattan Project probably involves direct atmospheric sampling activities performed from aircraft. These activities include measurements of temperature, humidity, and atmospheric composition. As the airplane flies through the atmosphere, specialized instruments collect atmospheric samples and take measurements. Direct atmospheric sampling from aircraft has historically been much more accurate than any remote sensing.
The aircraft performing this reconnaissance are probably the New Manhattan Project’s proprietary fleet of chemtrail spraying jet airliners along with commercial passenger airliners. For more about the New Manhattan Project’s proprietary fleet, see the author’s previous article “Death from Above: The New Manhattan Project Chemtrail Fleet.”
The weather modification literature provides us with information showing that commercial passenger airliners have historically been employed for the routine collection of atmospheric data. According to ICAS report number 20, as early as 1976, a system was being developed to, “…obtain real-time observations from airlines using the SMS/GOES satellites as a relay…”
Among other examples, National Science Foundation scientists published a 1983 paper detailing this. Under the heading of “Automated Aircraft Reports” the authors write:
Over 4,000 commercial aircraft reports are transmitted in real-time as part of the operational WWW [World Weather Watch] observational network. For FGGE, several special aircraft report programs supplemented these observations. The major special effort was called Aircraft Integrated Data System, or, AIDS. Approximately 80 aircraft (subsets of the DC-10, B-747 and Concorde fleets), equipped with AIDS, recorded winds and temperatures at spatial resolution of about 200 km along their flight path as functions of time and position. The data were recorded on cassettes which were processed later. The AIDS winds are obtained with the inertial navigation system and are very accurate. A final set of 677,836 AIDS observations, or, roughly 60,000 per month, was compiled. Most of the reports were made at flight levels close to 250 mb (approximately 10 km); 747 Special Performance Aircraft reported at flight levels near 15 km and the Concorde fleet reported from levels up to 20 km.
A second program, developed by the U.S. especially for FGGE, was the Aircraft to Satellite Data Relay (ASDAR). This automated system was installed in 17 aircraft (16 B-747 passenger aircraft and 1 C-141 cargo carrier).
Collecting atmospheric data by use of commercial aircraft also makes sense in the context of the New Manhattan Project due to the fact that commercial airliners fly at significantly lower altitudes than the NMP’s chemtrail spraying proprietary aircraft. Thus, by the time the spray wafts down to the commercial airliners’ altitudes, the spray is appropriately diffuse and ready to provide a more widespread and representative sample. If coverage is adequate, commercial jets need not go out of their way to collect a sample. If the equipment aboard these commercial jetliners has been collecting aluminum, barium and strontium samples, then these records may provide supporting evidence for possible future litigation.
Atmospheric modeling
Atmospheric models are designed to predict the future. They have been developed over the years as a way to produce weather forecasts. Current atmospheric observations are fed into the model, and the model computes what it expects to happen over the next hour, day, week, month, year, etc. Longer-term models are known as ‘climate models.’ When atmospheric models are used in the context of weather modification programs such as the New Manhattan Project, scientists can foretell the effects of man-made atmospheric modifications such as spraying chemicals from airplanes and/or the use of electromagnetic energy.
Atmospheric models have always required ultra-massive raw computing power, hence supercomputers have an extensive history of being used for atmospheric modeling. Supercomputers bring the atmospheric models to life. Just as it is not a coincidence that every aspect of the New Manhattan Project evolved together in a coherent chronological order, it is not a coincidence that supercomputers have evolved simultaneously with the development of atmospheric models. All these things are interdependent.
But, before the supercomputers came, there were atmospheric models. In this development, atmospheric models are the horse that has pulled the supercomputer wagon, so we will delve into supercomputing in the next section. For a bit of early atmospheric modeling history, let us read from a passage in Erik Conway’s Atmospheric Science at NASA: A History. It all began in the late 1800s:
The most effective promoter of dynamical meteorology, as the nascent discipline was called, was Vilhelm Bjerknes, founder of the Bergen School of meteorology and pioneer of the highly successful (but still not physics-based) system of air mass analysis. Bjerknes had contended that the physical laws by which the atmosphere functioned were already known, in the principles of fluid dynamics and thermodynamics. He gathered many converts, including the first person to actually try to calculate the weather, Lewis Fry Richardson.
Just as with the New Manhattan Project itself, things didn’t really get going until the 1940s. A little later, Atmospheric Science at NASA continues:
The development of the digital computer in the final years of World War II permitted the meteorological community to revisit numerical weather prediction. John von Neumann, an already-famous mathematician at the Institute of Advanced Studies in Princeton, had collaborated on the ENIAC computer project and in the process had developed the logical structure that eventually formed the basis of all stored-program digital computers. He sought funding for such a machine for use in scientific research. He had been introduced to Richardson’s work by Carl Gustav Rossby at the University of Chicago during a meeting in 1942, and became interested in applying the digital computer to weather prediction in early 1946. After a meeting with RCA’s Vladimir Zworykin and the head of the U.S. Weather Bureau, Francis W. Reichelderfer, and more than a little enthusiastic advocacy by Rossby and the Weather Bureau’s chief of research, Harry Wexler, von Neumann decided to establish a meteorology project associated with the computer he was trying to build, the EDVAC.
The National Academy of Sciences went on to write:
Charney, Von Neumann, and Rossby had demonstrated shortly after World War II that the mathematical equations describing the physical processes that determine the large-scale atmospheric motion could be integrated satisfactorily on an electronic computer by finite-difference methods. Phillips successfully used this technique in 1956 to reproduce the main features of the global wind systems from an atmosphere at rest. An avenue had been opened to explore in a systematic manner the large-scale fluctuations in the air motion and assess the consequences of either conscious or inadvertent human intervention.
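To give a concrete sense of what integrating the equations “by finite-difference methods” means, here is a minimal sketch that steps a one-dimensional advection equation forward in time on a grid. It is a toy stand-in for the full atmospheric equations, not a reconstruction of any historical model; the grid spacing, wind speed, and step count are arbitrary.

    # Minimal sketch: finite-difference time-stepping of a 1-D advection
    # equation, as a toy stand-in for the atmospheric equations. All numbers
    # below (grid spacing, wind speed, step count) are arbitrary.
    import numpy as np

    nx, dx = 100, 100_000.0        # 100 grid points, 100 km spacing (assumed)
    u = 10.0                       # constant "wind" of 10 m/s (assumed)
    dt = 0.8 * dx / u              # time step chosen to satisfy the CFL stability condition

    x = np.arange(nx) * dx
    field = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)   # smooth initial anomaly

    for _ in range(200):
        # Upwind finite difference: each point is updated from its upstream neighbor.
        field = field - u * dt / dx * (field - np.roll(field, 1))

    print("anomaly peak is now at grid point", int(field.argmax()))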
In the 1960s, scientists were working on computer simulations of weather modification.
In the late 1940s, Jule Charney advanced the field by coming up with something called a “barotropic model.” This type of model was later perfected by Norman Phillips in 1952.
In 1954, something called the Joint Numerical Weather Prediction Unit was established to serve the Air Force Weather Service, the civilian Weather Bureau, and the Naval Weather Service. At this Joint Numerical Weather Prediction Unit, the aforementioned Norman Phillips demonstrated the first general circulation model (GCM) to accurately replicate, “…the seemingly permanent, large-scale structures of the global atmosphere, such as the jet stream and the prevailing winds. Researchers in several places began to conduct general circulation experiments after Phillips’s demonstration.”
The ICAS later wrote, “The development of GCM’s for climatic studies has proceeded mainly at five institutions. Two of the institutions are Federal laboratories: NOAA’s Geophysical Fluid Dynamics Laboratory (GFDL) and NASA’s Goddard Institute of Space Science. The others are NCAR [National Center for Atmospheric Research], the University of California at Los Angeles, and the RAND Corporation.” When these general circulation models began to be developed was when man began to have a real understanding of Earth’s atmospheric circulations.
In 1963, the National Science Foundation (NSF) wrote of atmospheric models capable of demonstrating the result of, “…injecting energy directly into the atmosphere.” On page 25 of the NSF’s 1963 annual weather modification report, the authors write, “Finally, although details are not available, mention should be made of the imminent numerical experiment of Smagorinsky and his associates for the STRETCH computer of the U.S. Weather Bureau.”
“It is clear that with numerical models of the kind just described, experiments could be performed that would demonstrate the result of either modifying the earth’s surface in some specified way or of injecting energy directly into the atmosphere.” Injecting energy directly into the atmosphere is a distinguishing feature of the New Manhattan Project.
The NSF’s 1965 annual weather modification report states:
At NCAR [National Center for Atmospheric Research], an advanced mathematical model of the general circulation is under development by Akira Kasahara and his colleagues which will be used in a large computer to test methods of long-range prediction as well as to determine how changes in the general circulation might be achieved by various schemes of weather modification.
It just doesn’t get any clearer than that.
In the 1960s, geoengineers were studying atmospheric chemistry as it pertains to weather modification. Under the heading of “AEROSOL CHEMISTRY,” the 1965 NSF annual weather modification report states,
Research in this field has been relatively neglected until recently. A major portion of the NCAR [National Center for Atmospheric Research] research program is in this area–investigations of the life cycle of atmospheric aerosols, where they come from, how they are produced, what roles they play in the atmosphere, and how finally they are removed. Scientists at work on various aspects of this field at NCAR include Patrick Squires, Robert H. Bushnell, James P. Lodge, Jr., Edward A. Martell, Julian P. Shedlovsky, Farn Parungo, Jan Rosinski, and Arnold Bainbridge. Their studies cover development of techniques for sampling and analysis, especially at high altitudes; the microphysics and microchemistry of condensation and sublimation of water vapor on natural and artificial nucleants;…
The 1967 National Science Foundation annual weather modification report afforded us an update on the general status of atmospheric modeling development. Lawrence Livermore Laboratories had gotten involved! It states, “Based upon satellite observations and world weather data, several computer models to simulate the motions of the earth’s atmosphere on a global scale are now being constructed. Significant progress in formulating such models is being made at the National Center for Atmospheric Research, the Geophysical Fluid Dynamics Laboratory of ESSA, the University of California at Los Angeles with NSF support, and at the Livermore Laboratories of the University of California.” These computer models were developed under the Advanced Research Projects Agency’s (ARPA / later DARPA) Distributed Information Systems program division.
In 1974, a paper appeared in the proceedings of the Fourth Conference on Weather Modification titled “A Numerical Simulation of Warm Fog Dissipation by Electrically Enhanced Coalescence,” which outlined an atmospheric computer model that factors in artificial electrical influences. This research is significant because it formed the basis for future weather modification by atmospheric heaters AND the New Manhattan Project, which uses electromagnetic energy from ionospheric heaters to modify the weather.
From the mid-1970s to about 1990, the center of the National effort in atmospheric modeling shifted to Lawrence Livermore.
Lawrence Livermore National Laboratories has historically produced leading atmospheric models. For many years now, their Program for Climate Model Diagnosis and Intercomparison (PCMDI) has been producing cutting-edge atmospheric models. Specifically, the PCMDI describes their purpose as developing, “…improved and seamlessly interconnected methods for the diagnosis, validation and inter comparison of global climate models.”
The 1992 document “AMIP: The Atmospheric Model Intercomparison Project” by the Program for Climate Model Diagnosis and Intercomparison, Lawrence Livermore National Laboratory (LLNL) speaks to model improvements as a result of ongoing experiments comparing 29 atmospheric models from all of the world’s principal climate modeling groups. The findings were to be processed and archived at LLNL. In the ‘abstract’ section it states,
Organized by the Working Group on Numerical Experimentation (WGNE) as a contribution to the World Climate Research Programme, AMIP involves the international atmospheric modeling community in a major test and intercomparison of model performance; in addition to an agreed-to set of monthly-averaged output variables, each of the participating models will generate a daily history of state. These data will be stored and made available in standard format by the Program for Climate Model Diagnostics and Intercomparison (PCMDI) at the Lawrence Livermore National Laboratory. Following completion of the computational phase of AMIP in 1993, emphasis will shift to a series of diagnostic subprojects, now being planned, for the detailed examination of model performance and the simulation of specific physical processes and phenomena. AMIP offers an unprecedented opportunity for the comprehensive evaluation and validation of current atmospheric models, and is expected to provide valuable information for model improvement.
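As a rough illustration of the kind of diagnostic such an intercomparison performs, here is a minimal sketch comparing monthly-averaged output from two hypothetical models against a reference series using root-mean-square error. The numbers are invented, and the real AMIP diagnostics are far more extensive than a single error score.

    # Minimal sketch: compare monthly-mean output from two hypothetical models
    # against a reference series using root-mean-square error (RMSE).
    # All values below are invented for illustration.
    import numpy as np

    reference = np.array([288.1, 288.3, 288.9, 289.4, 290.0, 290.2])   # e.g. global-mean temperature (K)
    model_a   = np.array([288.0, 288.5, 289.1, 289.2, 290.3, 290.1])
    model_b   = np.array([287.5, 287.9, 288.2, 288.8, 289.1, 289.6])

    def rmse(model, ref):
        return float(np.sqrt(np.mean((model - ref) ** 2)))

    for name, series in [("model A", model_a), ("model B", model_b)]:
        print(f"{name}: RMSE vs. reference = {rmse(series, reference):.2f} K")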
As to the origins of the PCMDI, a little later this same document reads,
… the Program for Climate Model Diagnosis and Intercomparison (PCMDI) was established at the Lawrence Livermore National Laboratory (LLNL) by the Environmental Sciences Division of the U. S. Department of Energy [DOE] for the purpose of increasing understanding of the differences among climate models. The support and implementation of AMIP quickly became a priority PCMDI activity. Since that time, substantial resources have been provided by the DOE for the support of AMIP, including the provision of computer time to participating modeling groups at the National Energy Research Supercomputer Center at LLNL. AMIP is also coordinated with the DOE Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program (Bader et al., 1992).
The 1997 document “The PCMDI Software System: Status and Future Plans” by Dean N. Williams and the Lawrence Livermore National Laboratory sheds more light on Lawrence Livermore’s Program for Climate Model Diagnosis and Intercomparison. It speaks to collaborations with the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. It states, “While playing a leading role in the development of climate diagnostic tools, PCMDI has entered into collaborative agreements with the National Center for Atmospheric Research (NCAR) and the Los Alamos National Laboratory (LANL) for the provision of atmospheric and oceanic diagnostic functions that will be utilized in PCMDI’s software products.”
A little later in “The PCMDI Software System: Status and Future Plans,” more of LLNL’s cooperation with the National Center for Atmospheric Research (NCAR) is disclosed. On page 25 it states,
In July of 1995, PCMDI software team members met with Dave Williamson and Jim Hack of the National Center for Atmospheric Research, at which time the Visualization and Computation System (VCS) was demonstrated. As it was clear that PCMDI and NCAR would greatly benefit from collaboration, Dean Williams subsequently drafted a PCMDI-NCAR collaboration agreement. PCMDI’s part of the agreement was to provide NCAR with VCS, and to design and develop a calculator for climate analysis (now designated as CDAT). NCAR’s part of the agreement was to provide PCMDI with climate diagnostic routines.
“Long-Range Weather Prediction and Prevention of Climate Catastrophes: A Status Report,” a 1999 Lawrence Livermore National Laboratory report by Edward Teller, Ken Caldeira, Lowell Wood, et al. speaks to big developments in “global models.” The report states,
The USDoE and LLNL, via their advanced Environmental Science and Computing initiatives, are attempting to close the resolution gap between the regional and global models by supporting the development of the next generation of very high resolution global models; LLNL’s Atmospheric Science Division is currently using the NCAR (National Center for Atmospheric Research) CCM3 GCM as well as the NRL (Naval Research Laboratory) COAMPS Regional Prognostic Model for application studies in the area of global climate change and regional studies of the atmospheric transport and fate of hazardous materials. These models are being ported to the current ASCI [Advanced Strategic Computing Initiative] machines.
Today’s geoengineers may be using a climate model called the Community Climate System Model 3.0 (CCSM3) or something like it. The May, 2005 document “Global Biogeochemistry Models and Global Carbon Cycle Research at Lawrence Livermore National Laboratories” by Ken Caldeira et al. makes note of a, “…full physical climate system (atmosphere-ocean-ice-land) model.” They are writing about the Community Climate System Model 3.0 (CCSM3).
The 2006 document “Terrestrial Biogeochemistry in the Community Climate System Model (CCSM)” by Forrest Hoffman et al. makes note of the latest leading climate model and its producers.
The Community Climate System Model Version 3 (CCSM3) is a coupled modeling system consisting of four components representing the atmosphere, ocean, land surface, and sea ice linked through a coupler that exchanges mass and energy fluxes and state information among these components. CCSM3 is designed to produce realistic simulations of Earth’s mean climate over a wide range of spatial resolutions. The modeling system was developed through international collaboration and received funding from the National Science Foundation (NSF) and the Department of Energy (DOE) as well as support from the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA). A portion of DOE’s support for CCSM has been through SciDAC Projects, including the multi-laboratory Climate Consortium Project headed by Phil Jones and John Drake.
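As a rough illustration of the coupled architecture described above, here is a minimal sketch of a coupler loop passing values between placeholder components. The “physics” here is trivial filler that bears no resemblance to the real CCSM3 code; it only shows the pattern of components exchanging state through a central coupler.

    # Minimal sketch: a coupler stepping several placeholder components and
    # passing an exchanged value between them. Not real CCSM3 physics.
    class Component:
        def __init__(self, name, state=0.0):
            self.name, self.state = name, state

        def step(self, incoming):
            # Placeholder physics: relax toward the incoming exchange value.
            self.state += 0.1 * (incoming - self.state)
            return self.state          # handed back to the coupler as the outgoing value

    def run_coupler(components, n_steps=5):
        exchange = 1.0                 # arbitrary initial exchange value
        for step in range(n_steps):
            for comp in components:
                exchange = comp.step(exchange)   # the coupler routes each output onward
            print(f"step {step}: " + ", ".join(f"{c.name}={c.state:.3f}" for c in components))

    run_coupler([Component("atmosphere"), Component("ocean"),
                 Component("land"), Component("sea ice")])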
If you are wondering about the Phil Jones named in the passage above… yes, that is Phil Jones of Climategate infamy.
If you will recall, Climategate was the hacked email scandal that caught the people responsible for the IPCC’s climate data with their pants down. Yeah, that was where they were caught explaining how they cook the data, keep opposing viewpoints out of the official discussion, and practice other decidedly unscientific activities. It appears that the spirit of the pre-scientific era of weather modification lives on.
Supercomputers
Supercomputers have historically been manufactured by corporations like IBM, Cray, Control Data Corporation, Texas Instruments, Burroughs, and the Univac Division of Sperry Rand. Today’s big dogs in the space are IBM and Cray.
Supercomputers by nature are very large. A supercomputer can range from a single machine about the size of a telephone booth (if anyone remembers what those are), to a large room filled with scores of big, hot boxes. Over the years supercomputers have, of course, become much faster, but their size remains a constant.
Oddly enough, the earliest examples of supercomputer atmospheric simulations have connections to the original Manhattan Project. In an interview conducted not long before his death, the well-known MIT meteorologist Jule Charney (1917-1981) spoke of the famous mathematician John von Neumann (1903-1957) working on atmospheric supercomputing in the mid to late 1940s, after von Neumann’s hydrodynamical work for the Manhattan Project. He said:
…Zworykin [Vladimir] was then with GE . . . RCA labs . . . and, um, Zworykin was interested in the meteorological problem. And I think it very likely that through their friendship that von Neumann took up the meteorological problem as sort of par excellence a problem for a large computer. Um, but previous to that, he had become interested in hydrodynamical problems in relationship to the Manhattan Project and the design of, uh, implosion devices and such things. And later on I think he did some hydrodynamical work in connection with the H-bomb. And I know that . . . that when the Prin . . . the computer at the Institute for Advanced Study was completed, one of the first, uh, problems was a highly classified problem . . . it concerned the H-bomb.
The Princeton Institute for Advanced Study was where John von Neumann pioneered mathematical atmospheric modeling.
Founded at the site of an old WWII Naval air station, Lawrence Livermore Laboratories has been involved in supercomputing since its 1952 inception. In that year it purchased its first supercomputer: a Sperry Rand UNIVAC 1, delivered in April of 1953. Its programmers first focused on computerized simulations of atomic bomb detonations; later, the focus became atmospheric modeling. In the early to mid 1950s, Livermore Labs was the premier buyer of early supercomputers. Let us refer to a passage from The Supermen: The Story of Seymour Cray and the Technical Wizards behind the Supercomputer. It reads:
For computer manufacturers the needs of the bomb builders created an incredible opportunity. Throughout the early and mid-1950s, Livermore and Los Alamos stepped up their computing efforts until a friendly competition formed between the two labs. They vied for prestige; they vied for funding; they vied for access to the first of every kind of computer. For both labs computers emerged as status symbols, much as they had for giant corporations such as General Electric.
Although others such as the National Center for Atmospheric Research (NCAR) and the Department of Defense were prolific supercomputer buyers, elsewhere in The Supermen, Lawrence Livermore Labs is described as, “…the leader among industry users. When Livermore purchased a new machine, the other government labs took notice.”
Others such as DARPA, the Department of Defense, and NASA have historically been supercomputer buyers and industry supporters. All these organizations have been deeply involved in the New Manhattan Project. Atmospheric modeling has always been a supercomputer priority.
By the mid-1960s, supercomputers were standard equipment. The Committee on Government Operations tells the story in 1965, “High speed digital computers have been used in meteorological data analysis and weather prediction for nearly 10 years. Until 1955 weather data, gathered daily from all over the world, were sorted and processed almost entirely by hand and then entered manually on charts. Furthermore, until recently the job of forecasting was largely a combination of the forecaster’s experience, certain statistical relationships, and qualitative or semi qualitative physical reasoning. Today, a large portion of the routine data handling and processing is performed by computers, and certain types of weather forecasts are now produced automatically by these machines. Computers are also used extensively by certain of the Nation’s atmospheric science research laboratories such as the Weather Bureau’s Geophysical Fluid Dynamics Laboratory and the National Center for Atmospheric Research.”
The document goes on to further state, “The National Meteorological Center at Suitland, Md., has a large scientific computer, the IBM-7094 II, and two smaller IBM-1401 computers. This equipment, together with the National Weather Satellite Center’s computer complex, which also leases an IBM-7094 II, make the Weather Bureau’s Suitland facilities the largest meteorological computing center in the world.”
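As a purely illustrative aside, the “automatic” forecasts those machines produced rested on finite-difference time-stepping of simplified atmospheric equations. The sketch below advances a one-dimensional advection equation in that spirit; every parameter value is an arbitrary demonstration choice, and this is not any agency’s actual forecast code.

```python
import numpy as np

# Minimal 1-D linear advection integration, illustrative of the kind of
# finite-difference time-stepping behind early automated forecasts.
# All parameter values are arbitrary, for demonstration only.
nx, dx = 100, 100_000.0          # 100 grid points spaced 100 km apart (metres)
u, dt = 10.0, 600.0              # 10 m/s steering wind, 10-minute time step
x = np.arange(nx) * dx
field = np.exp(-((x - 2.0e6) / 3.0e5) ** 2)   # initial "pressure anomaly"

for _ in range(144):             # advance 24 hours (144 x 10-minute steps)
    # First-order upwind difference with periodic boundaries;
    # stable because the Courant number u*dt/dx = 0.06 is below 1.
    field = field - (u * dt / dx) * (field - np.roll(field, 1))

print("anomaly peak now near x =", x[np.argmax(field)] / 1000.0, "km")
```

Real operational models are vastly more elaborate, but the loop above captures the basic idea: march a gridded atmospheric field forward in time, one small step after another.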
With these multi-million-dollar machines, there may have been some chicanery going on. In 1976 congressional testimony, author Lowell Ponte described early atmospheric supercomputing: “Defense Department climate research coalesced in the early 1970s in a DARPA project called ‘Nile Blue.’ In fiscal 1972 the project had a budget of $2.587 million to develop and monitor computer ‘models’ of changes in world climate. For this purpose it was provided the most sophisticated computer yet developed, the ‘ganged’ ILLIAC IV system,…”
A little later Mr. Ponte continues:
…Nile Blue changed its name to ‘Climate Dynamics’ and DARPA, under whose control it remained, announced that henceforth the program was completely open and unsinister. In 1973, in a seemingly unrelated event, Dr. Lawrence W. Gates announced that the Ames Research Center in California had just acquired a new ILLIAC IV ‘ganged’ computer from DARPA, which he would be using for world climate modeling. Dr. Gates is head of climate research at the RAND Corporation in Santa Monica, California. The RAND Corporation was established in 1948, at the beginning of the ‘Cold War’ between the United States and Soviet Union, to serve as a civilian intelligence arm of the Department of Defense. Now diversified, it still does classified military studies for the Pentagon. As a private organization, the RAND Corporation is not subject to the same degree of congressional scrutiny as the Department of Defense.
In 1972 congressional hearings before Senator Pell’s subcommittee, the Defense Department said it was conducting no classified weather research. Climate research was unmentioned.
Today, most supercomputers are connected to each other via the Internet, and it has been this way for a long time. The early phone and Internet connections (coupled with glacial computers) were agonizingly slow by today’s standards; until the late 1980s, sending information between supercomputers was a laborious task. The authors of NCAR’s 1988 annual report tell the story:
NCAR’s computers are used not only by scientists at the center but by researchers from across the country who need the most sophisticated computational resources available. Since the mid-’70s, they have been able to connect to NCAR’s computers from their home institutions through telephone links, but this method has been frustrating and often unsatisfactory. Scientists have had a hard time learning the NCAR computer systems from their remote locations, and critical tasks such as editing their files over the remote links have been difficult. They have had to continually redial NCAR’s computers to see whether their work was completed. If the job output was large, they have had to rely on the U.S. mails to send it back, because large data and graphics files were almost impossible to transfer using telephone links.
This year all of that changed, thanks to an innovation known as the Internet Remote Job Entry (IRJE) system. IRJE is a method by which remote users of NCAR’s supercomputers can use a networking system to submit jobs directly from their local computers and have them returned directly.
Central to IRJE is the Internet, a collection of campus, regional, and wider-area computing networks that use a common protocol called TCP/IP. The National Science Foundation’s ‘network of networks,’ NSFNET, is at the core of the Internet. It ties smaller networks to each other and to supercomputing centers including NCAR. Thanks to these connections and to NCAR’s own satellite network, IRJE has been able to progress from a good idea to a working and popular tool.
Virtually any atmospheric researcher in the country can now submit a job directly to NCAR’s computers using simple commands. The off-site computer can be anything from a personal computer to a minicomputer or a mainframe, as long as it is linked in some way to an appropriate network. The researcher can sit at a desk and prepare a problem, send it to NCAR even from thousands of miles away, and get the results back automatically and quickly. He or she can edit files and view data output and graphics at the remote site. The technology by which this is done, developed by Britt Bassett of NCAR’s Scientific Computing Division, is elegantly simple from the user’s perspective. It requires very little knowledge of NCAR’s systems, and it is ‘invisible’ to users, who are unaware of the maze of connections between them and the SCD computers.
Today, virtually the entire research community has access to the IRJE. Consequently, university scientists can make progress much faster. The simplicity of the system allows them to concentrate on science by minimizing the need to learn new computer procedures and eliminating the time needed to track work’s progress. Files as large as ten novels are routinely sent back and forth. Because such sizable amounts of work, including graphics, can now be delivered to the scientist quickly and easily, research problems that were not previously attempted off-site are done with ease. Coupled with the networks that have made it possible, the IRJE system marks the beginning of a new era for remote users of NCAR’s computing facilities.
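The IRJE protocol itself is not documented in the passage above, but the general pattern it describes — submit a job over a TCP/IP connection and have the output streamed back — can be sketched in a few lines. The host name, port, and wire format below are hypothetical placeholders, not NCAR’s actual interface.

```python
import socket

# Purely illustrative sketch of the remote-job-entry pattern described above:
# send a job over a TCP/IP connection, read the results back.
# The endpoint and wire format here are hypothetical, not IRJE's.
JOB_HOST, JOB_PORT = "jobs.example.edu", 9000   # hypothetical job server

def submit_job(script_text: str) -> str:
    """Send a job script to the remote queue and return its text output."""
    with socket.create_connection((JOB_HOST, JOB_PORT), timeout=30) as s:
        s.sendall(script_text.encode("utf-8"))
        s.shutdown(socket.SHUT_WR)              # signal end of submission
        chunks = []
        while chunk := s.recv(4096):            # read until the server closes
            chunks.append(chunk)
    return b"".join(chunks).decode("utf-8")

# Example call (only succeeds against a real server at JOB_HOST):
# print(submit_job("run_model --case test01\n"))
```

The point is simply that once both ends speak TCP/IP, a researcher’s desktop machine and a distant supercomputer can exchange jobs and results without tapes or the mail.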
The Accelerated Strategic Computing Initiative (ASCI) at Lawrence Livermore National Laboratory has historically been at the forefront of atmospheric supercomputing. With its roots in Lawrence Livermore’s extensive history of computerized nuclear detonation simulations and artificial intelligence experimentation, the ASCI evolved from the Defense Advanced Research Projects Agency’s Strategic Computing Initiative. The ASCI began operating in the mid-1990s.
“The History of the Accelerated Strategic Computing Initiative,” prepared by Alex Larzelere for Lawrence Livermore, outlines the ASCI’s purpose. On page 185 the document states, “Deep understanding and simulation will be essential as the U.S. and the international community develops new systems to monitor, analyze and predict climate change, the effects of climate change and the efficacy of climate change mitigation measures. This increased understanding will directly feedback into the DOE’s [Department of Energy] – and others – energy development efforts: an integrated climate and energy system.” As we have seen, ‘climate change mitigation measures’ is code for the New Manhattan Project.
So what comprises today’s New Manhattan Project supercomputing effort? Although NCAR and LLNL both operate some of the world’s fastest supercomputers, all of the world’s supercomputers have been connected via the Internet for a long time now. The supercomputing power available to the command centers of the New Manhattan Project is therefore probably close to the combined power of the world’s supercomputers. Many other parties share that computing power, but there is probably still plenty left to run the NMP.
Opening the effort to the broader Internet and World Wide Web, on the other hand, would probably not be advantageous. Although sharing computing power over the Internet is commonplace today, the secret and high-tech nature of the NMP probably precludes broader participation; letting everybody share would create security and technical problems. A supercomputer-only network also makes sense because the deadly serious NMP demands a level of security and professionalism that a network of institutionally owned and administered supercomputers affords.
Command Centers
Like classical music conductors, today’s top geoengineers must bring together all the members of their orchestra. A project this massive requires a team of people running it, and that team requires facilities capable of running the New Manhattan Project. There aren’t many locations in the country, or in the world, that might qualify.
Evidence and logic suggest that the command and control centers of the New Manhattan Project employ large holographic displays. These displays probably produce very realistic three dimensional imagery in a theatre-in-the-round type of setting viewable by groups of lead scientists. This would be the best way to display the information and allow for discussion. Technicians can more easily respond to orders from the lead scientists when they are all looking at the same thing.
A NMP command center probably looks a little something like this:
[Image: theater in the round. Image source: Made in Shoreditch]
The background walls would be plastered with computer screens showing all kinds of atmospheric information. In the rows would be comfortable recliners with power and USB connections for laptop computers. The hanging lights are where the holograms would appear.
Holography goes all the way back to 1948 and the work of Dr. Dennis Gabor (1900-1979), who later received the 1971 Nobel Prize in Physics for inventing it. Holographic movies have been around since the 1960s.
In “The Electronics Research Center: NASA’s Little Known Venture into Aerospace Electronics,” Andrew Butrica, Ph.D. writes of holographic displays being developed at NASA’s Cambridge, Massachusetts Electronics Research Center in the late 1960s. This facility is known today as the John A. Volpe National Transportation Systems Center.
As early as 1964, our Air Force was recreating atmospheric volumes in a holographic fashion. Page 49 of the National Science Foundation’s seventh annual weather report states, speaking of their laser disdrometer, “By reconstructing the sample volume in three dimensions from the two-dimensional diffraction pattern film record, called a hologram, the size, shape, and relative position of each particle in the original volume are determined.”
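For readers curious how a flat diffraction-pattern record can yield a three-dimensional sample volume, here is a minimal numerical sketch of hologram reconstruction by the angular spectrum method. The wavelength, pixel pitch, and reconstruction depth are arbitrary demonstration values, the synthetic input merely stands in for real film or sensor data, and this is not the Air Force disdrometer’s actual processing chain.

```python
import numpy as np

# Illustrative sketch: numerically refocus a recorded 2-D diffraction
# pattern (hologram) at a chosen depth using the angular spectrum method.
def reconstruct(hologram: np.ndarray, wavelength: float,
                pitch: float, depth: float) -> np.ndarray:
    """Propagate the hologram field a distance `depth`; return intensity."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pitch)            # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Keep only propagating spatial frequencies; clip evanescent terms.
    arg = np.clip(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0, None)
    transfer = np.exp(2j * np.pi * depth / wavelength * np.sqrt(arg))
    field = np.fft.ifft2(np.fft.fft2(hologram) * transfer)
    return np.abs(field) ** 2

# Example: a synthetic 512x512 hologram "refocused" 5 cm from the sensor plane.
holo = np.random.rand(512, 512)                 # stand-in for recorded data
plane = reconstruct(holo, wavelength=633e-9, pitch=10e-6, depth=0.05)
print(plane.shape, float(plane.max()))
```

Repeating the reconstruction at a series of depths yields a stack of focal planes, which is how the size, shape, and position of each particle in the original volume can be recovered from a single two-dimensional record.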
In the summer of 1968, as previously noted, the NSF reported the Bureau of Reclamation’s successful operation which produced three-dimensional atmospheric models. Any image that can be stored and reproduced in three dimensions can be reproduced as a hologram.
Many years ago our Army reported that it was collecting atmospheric data in a fashion that would lend itself to holographic display. This is what today’s geoengineers are probably doing. Way back in 1975 the Army reported,
…developments in microwave radar, laser radar, passive radiometry, and acoustic sounding have demonstrated that it is now feasible to consider real-time atmospheric sensing systems for the combat Army. Remote sensing permits the measurement of relevant parameters of the atmosphere in one, two, or three spatial dimensions, all as a function of time…
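As a rough illustration of what “one, two, or three spatial dimensions, all as a function of time” implies for the data itself, the sketch below organizes atmospheric parameters as a time-indexed three-dimensional grid. The array shapes, variable names, and values are hypothetical demonstration choices, not any fielded system’s format.

```python
import numpy as np

# Illustrative only: one common way to organize remotely sensed atmospheric
# parameters measured in up to three spatial dimensions as a function of time.
n_time, n_z, n_y, n_x = 24, 40, 60, 60          # hourly scans of a 3-D volume
temperature = np.full((n_time, n_z, n_y, n_x), 288.0)   # kelvin
wind_u = np.zeros_like(temperature)                      # m/s, east-west component

# A "real-time" consumer takes the latest complete volume and slices it:
latest = temperature[-1]              # shape (n_z, n_y, n_x)
column = latest[:, 30, 30]            # vertical profile above one ground point
layer = latest[5]                     # horizontal cross-section at one level
print(column.shape, layer.shape)
```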
An alternative to this suggested use of holography would be the group usage of synchronized virtual reality headsets. This scenario would be advantageous because it would preclude the possibility of clandestine filming of any holographic imagery.
Now we get down to the last locked dungeon door. It’s dark and dank. Green slime and black grime cover the huge stone blocks of the walls. Rats scurry along in the corner. The smell of rotting flesh permeates the air. We can hear the low growling, the howls of pain, and the insane, echoing laughter.
Where are they? Where have our tormentors been hiding? Who has been laughing at our pain?
Let us use our key that has been fashioned through so much scratching and clawing research. Now we slide the key into the lock, noting the exact fit. It turns snugly. The portal is open. There is no going back now. The spirits beckon us. Let us pass from ignorance into wisdom.
The most probable location for a New Manhattan Project C4 center is Lawrence Livermore National Labs in Livermore, CA.
Conclusions
There are a lot of unknowns here. It would be relatively easy for the highly compartmentalized military/industrial complex to secretly build and launch a satellite or series of satellites. A Department of Defense document titled “Space Weather Architecture Transition Plan” mentions a ‘classified’ satellite program operating between 1998 and 2008. Many Earth stations may be hiding secret equipment and covert functions. Supercomputer data and communications can be heavily encrypted. All the potential command centers are under ultra-tight security. Only a very small portion of the population is aware of any of this. All the components of the New Manhattan Project command and control apparatus either can be or are under a veil of secrecy.
Despite the many uncertainties here, the evidence also suggests some definite probabilities.
When searching for the large scientific facilities that would necessarily house the people and equipment running the New Manhattan Project, there aren’t many locations in the world with anything resembling the massive technological resources required. Among the very few capable locations, the evidence points first to Lawrence Livermore National Laboratory (LLNL) and second to the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. There may be a third command center in the East.
LLNL has a long history of conducting classified scientific projects and has been deeply involved in the design and manufacture of nuclear weapons. Nuclear weapons were originally produced by the first Manhattan Project. Strangely enough, nuclear weapon development largely morphed into the atmospheric sciences.
The Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore Labs is probably running the New Manhattan Project’s atmospheric modeling efforts. The PCMDI was founded by the Department of Energy, which has extensive ties to the New Manhattan Project. The Department of Energy used to be the Atomic Energy Commission, and we have seen all the connections there.
The Accelerated Strategic Computing Initiative (ASCI) at Lawrence Livermore Labs is probably running the New Manhattan Project’s supercomputers. The ASCI evolved from a Defense Advanced Research Projects Agency (DARPA) program and did work for the Department of Energy (DOE). It looks like DARPA and the DOE have been running the New Manhattan Project all along.
Specifically, it looks like LLNL’s New Manhattan Project participation can be found in their Atmospheric, Earth, and Energy Division.
LLNL’s collaborations with NCAR are incriminating because these investigations have shown that NCAR has been instrumental in the New Manhattan Project as well. NCAR has always had similar equipment and conducted similar activities to those of LLNL. The main difference between LLNL and NCAR is that NCAR specializes in the atmospheric sciences while LLNL does not.
NCAR’s position near the geographical center of the contiguous United States would be more convenient for geoengineers who live in the central regions of the country. NCAR sits on a mesa at the foot of the Rockies. It’s pretty well known that the facility is protected by being deeply embedded in the Earth, much like NORAD, which is further south and built into the same geological formation.
For this reason of convenience, there may also be a third C4 center in the East. The prime candidate is Wright-Patterson Air Force Base. As these investigations have shown, Wright-Patterson has some of the most extensive ties to the New Manhattan Project.
Now we have a good idea of whence the New Manhattan Project is being conducted. Somebody is doing it. At least one C4 center has to be somewhere. The most probable location is Lawrence Livermore National Labs.
The results of this investigation may be very important to potential future litigation. If we know where the people running the New Manhattan Project work, then we can lawfully shut those locations down. If we successfully shut one or more of these centers down, then we may also gain access to treasure troves of hard evidence.
It appears that the entire scientific apparatus supposedly operating to monitor and analyze anthropogenic climate change is in fact the practicing apparatus which monitors, analyzes, commands, and controls the atmosphere for the New Manhattan Project. These systems evolved from the New Manhattan Project’s roots in weather modification. These systems were created in chronological lock-step with the New Manhattan Project’s other developments. It’s all much too big to hide. A cover story as big as the Project was created. These people have already been caught lying during Climategate.
Guess what… the establishment lied to us. The theory of man-made global warming is designed to sell an incredibly gigantic, mass-murderous rip-off. It’s designed to make us all accept a lower quality of life. It’s much bigger than the oil companies. It is the justification for the New World Order and population reduction. If you aren’t aware of this, please come to your senses, just as your author has, and admit we were wrong. Let’s just admit it so that we can better shut down the New Manhattan Project and move on with life. We cannot live under this.
The theory of man-made global warming is cut from whole cloth by the establishment. They have crafted an exquisite deception for us. Most people have only adopted it because of unbelievably subversive and immense public relations campaigns. We have been ruled by the most devilishly cunning minds; have no doubt about that, my friend.
Earth’s climate fluctuates naturally. It always has. It’s driven by the Sun. It’s not our responsibility.
No, man is not significantly changing the climate by emitting CO2, but with the New Manhattan Project he most certainly is. Their cover story for this monstrosity is false. Please see through this deception and help stop the spraying now. Thank you.
Peter A. Kirby is a San Rafael, CA researcher, author and activist. Check out his ebook Chemtrails Exposed: A New Manhattan Project. Follow him on Twitter @PeterAKirby.
Notes
Planet Earth: The Latest Weapon of War a book by Rosalie Bertell, published by Black Rose Books, 2001
Atmospheric Science at NASA: A History a book by Erik M. Conway, published by The Johns Hopkins University Press, 2008
Introduction to Remote Sensing: Fifth Edition a book by James B. Campbell and Randolph H. Wynne, published by The Guilford Press, 2011
“Government Weather Programs (Military and Civilian Operations and Research)” a report by the Committee on Government Operations, published by the U.S. Government Printing Office, 1965
Beyond the Atmosphere: Early Years of Space Science a book by Homer E. Newell, published by Dover Publications, 2010
“Chemical Releases in the Ionosphere” a paper by T. Neil Davis, published by the Institute of Physics, 1979
Rockets Over Alaska: The Genesis of Poker Flat a book by Neil Davis, published by Alaska-Yukon Press, 2006
“Observations of the Development of Striations in Large Barium Ion Clouds” a report by T. N. Davis et al. for the Rome Air Development Center, Advanced Research Projects Agency, published by the National Technical Information Service, 1972
The Interdepartmental Committee for Atmospheric Sciences (ICAS) reports 1960-1978, published by the Federal Council for Science and Technology
Satellite Technology: Principles and Applications, Second Edition a book by Anil K. Maini and Varsha Agrawal, published by John Wiley and Sons, 2011
“National Environmental Satellite, Data, and Information Service, 1996-97” a report by the United States National Environmental Satellite, Data, and Information Service, 1997
“Use of the ERTS-1 Data Collection System in Monitoring Weather Conditions for Control of Cloud Seeding Operations” a report by Dr. A.M. Kahan, published by the Bureau of Reclamation, 1974
“Inquiry into the Feasibility of Weather Reconnaissance from a Satellite Vehicle” by S.M. Greenfield and W.W. Kellogg, published by the Rand Corporation, 1960
Operation Paperclip: The Secret Intelligence Program that Brought Nazi Scientists to America a book by Annie Jacobsen, published by Little, Brown and Company, 2014
Hughes After Howard: The Story of Hughes Aircraft Company a book by D. Kenneth Richardson, published by Sea Hill Press, 2012
“A Review of Federal Meteorological Programs for Fiscal Years 1965-1975” a report by Clayton E. Jensen, published by the Federal Coordinator for Meteorological Services, 1975
“On Geoengineering and the CO2 Problem” by Cesare Marchetti, as published in Climatic Change, 1977
“The Changing Atmosphere: An Integrated Global Atmospheric Chemistry Observation Theme” a report by the International Global Observing Strategy, 2004
National Science Foundation Weather Modification Annual Reports 1959-1968
“National Advisory Committee on Oceans and Atmosphere first annual report to the President and the Congress” 1972 as it appeared in the congressional record of testimony by Dr. Alfred J. Eggers Jr., NSF during hearings before the Subcommittee on the Environment and the Atmosphere of the Committee on Science and Technology, U.S. House of Representatives, Ninety-fourth Congress, second session, 1976
“Evaluation of Iridium Satellite Phone Voice Services for Military Applications” a report by Caroline Tom and Lyle Wagner, published by the Defence Research Establishment, Ottawa, 1999
“Exploring the Interface Between Changing Atmospheric Composition and Climate” a report by the Network for the Detection of Atmospheric Composition Change, 2011
“The Global Atmospheric Research Program: 1979-1982” a paper by Jay S. Fein and Pamela L. Stephens, as published in Reviews of Geophysics and Space Physics, vol. 21, no. 5, p 1076-1096, June 1983
“Weather & Climate Modification: Problems and Progress” a report by the National Academy of Sciences, 1973
“A Numerical Simulation of Warm Fog Dissipation by Electrically Enhanced Coalescence” a paper by Paul M. Tag as it appeared in the proceedings of the American Meteorological Association Fourth Conference on Weather Modification, 1974
“The PCMDI Software System: Status and Future Plans” a report by Dean N. Williams for the Program for Climate Model Diagnosis and Intercomparison, published by Lawrence Livermore National Laboratory, 1997
“AMIP: The Atmospheric Model Intercomparison Project” a report by W. Lawrence Gates for the Program for Climate Model Diagnosis and Intercomparison, published by Lawrence Livermore National Laboratory, 1992
“Long-Range Weather Prediction and Prevention of Climate Catastrophes: A Status Report” a report by E. Teller, K. Caldeira, G. Canavan, B. Govindasamy, A. Grossman, R. Hyde, M. Ishikawa, A. Ledebuhr, C. Leith, C. Molenkamp, J. Nuckolls and L. Wood, published by Lawrence Livermore National Laboratory, 1999
“Global Biogeochemistry Models and Global Carbon Cycle Research at Lawrence Livermore National Laboratories” a report by C. Covey, K. Caldeira, T. Guilderson, P. Cameron-Smith, B. Govindasamy, C. Swanston, M. Wickett, A. Mirin and D. Bader, published by Lawrence Livermore National Laboratory, 2005
“Terrestrial Biogeochemistry in the Community Climate System Model (CCSM)” a report by Forrest Hoffman, Inez Fung, Jim Randerson, Peter Thornton, Jon Foley, Curtis Covey, Jasmin John, Samuel Levis, W. Mac Post, Mariana Vertenstein, Reto Stöckli, Steve Running, Faith Ann Heinsch, David Erikson, and John Drake as published in the Journal of Physics: Conference Series 46, 2006
“Conversations with Jule Charney” a report by George W. Platzman, published by the Massachusetts Institute of Technology and the National Center for Atmospheric Research, 1987
Lawrence Livermore National Laboratory: Webster’s Timeline History 1941-2007 a book by Professor Philip M. Parker, Ph.D., published by ICON Group International, 2009
The Supermen: The Story of Seymour Cray and the Technical Wizards behind the Supercomputer a book by Charles J. Murray, published by John Wiley & Sons, Inc., 1997
The ILLIAC IV: The First Supercomputer a book by R. Michael Hord, published by Computer Science Press, 1982
“Hearings Before the Subcommittee on the Environment and the Atmosphere of the Committee on Science and Technology, U.S. House of Representatives, Ninety-fourth Congress, second session,” 1976
National Center for Atmospheric Research annual report 1988
“The History of the Accelerated Strategic Computing Initiative” a report by Alex R. Larzelere II, published by Lawrence Livermore Laboratory, 2009
The Complete Book of Holograms: How They Work and How to Make Them a book by Joseph E. Kasper and Steven A. Feller, published by Dover Publications, Inc., 2013
“The Electronics Research Center: NASA’s Little Known Venture into Aerospace Electronics” a paper by Andrew Butrica, Ph.D., published by the American Institute of Aeronautics and Astronautics, Inc., 2002
“Space Weather Architecture Transition Plan” a report by the Office of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, 2000
Websites
pfrr.alaska.edu
darpa.mil
energy.gov
ge.com
sri.com
raytheon.com
nasa.gov
noaa.gov
princeton.edu
jhu.edu
iastate.edu
harvard.edu
navy.mil
airforce.com
defense.gov
commerce.gov
nsf.gov
wcrp-climate.org
bnl.gov
llnl.gov
sml.doe.gov
un.org
wmo.int
esa.int
uchicago.edu
nist.gov
motorola.com
rca.com
weather.gov
nasonline.org
ncar.ucar.edu
ucla.edu
rand.org
lanl.gov
ibm.com
cray.com
ti.com
burroughs.com
web.mit.edu
volpe.dot.gov
usbr.gov
army.mil