Time and again, technological advances have embarrassed the most astute planners. While we can think ahead toward the advanced aerospace vehicles of the future, it is almost a certainty that breakthroughs in flight technology and/or wind tunnel capabilities will make today's best thinking obsolete. Remember the surprises of Sputnik, the Area Rule, the blunt entry shape, and the supercritical airfoil. Nevertheless, the future must be faced rationally, and some general observations are in order.
First in importance is the fact that whatever happens, NASA's wind tunnel complex, which has a current replacement value upwards of $1 billion, will best serve the future if it possesses a broad base of capabilities. Second, the average age of the existing major NASA tunnels was roughly 25 years in 1980. The recently refurbished Langley full scale wind tunnel will celebrate its 50th birthday in 1981. Even the big Unitary Plan tunnels, usually considered modern, had been in operation for 25 years by 1980. America's wind tunnel inventory is aging, and there is no room for complacency.
The third generality concerns the increasing reliance on wind tunnels in aircraft design development. Whereas the venerable DC-3 needed only about 100 hours of tunnel time, the B-52 bomber took 10 000 hours. By the time the first Space Shuttle flew, it had accumulated 100 000 hours of wind tunnel time. New aircraft are becoming more complex, with demands for increased speed, altitude, temperature, and overall size and weight. Deficiencies in design are incredibly expensive to correct in production models. It is no wonder that engineers rely more and more on wind tunnel testing early in the development cycle.
Happily, there are compensating factors. Thanks to the electronic revolution, wind tunnel controls are better, and there is much more automation of instrumentation and data gathering. A tunnel hour today is much more productive than it was a few years ago.
A more subtle observation is that today's bigger tunnels better simulate the actual Reynolds numbers encountered in full scale flight. This circumvents the laborious building of a fund of experience at reduced Reynolds numbers (often in several different tunnels) and subsequent time consuming and often questionable extrapolation to full scale conditions.
Aerodynamicists look forward to the future; they now speak of electronic wind tunnels. What they really mean is that aerodynamic theory has improved considerably, and electronic computers have more than kept pace, so that the mathematical prediction of the performance characteristics of aircraft and their components is much more accurate. Not only can simple aircraft components be studied in depth without recourse to the wind tunnel but, in some situations, so can complete vehicle configurations. Real wind tunnels, of course, will still be called on for research, validation of calculations, and performance assessment where theory falters.
The number of wind tunnel tests required for new aircraft has risen by several orders of magnitude during the past half century.
For almost half a century, the transonic regime of flight has preoccupied aerodynamicists. Almost all modern commercial, military, and aerospace craft fly near, in, or through the transonic regime. Yet true simulation of full scale transonic Reynolds numbers did not become possible until a complete break was made with conventional wind tunnel design in 1973, when the NASA 0.3 meter transonic cryogenic tunnel went on line at Langley. It was cost that had deterred the construction of full scale conventional wind tunnels for the transonic regime: if either high pressure or large size were used to achieve full scale Reynolds numbers, the cost of the tunnel shell and drive equipment would have been prohibitive. The key, as the 0.3 meter cryogenic tunnel proved, was decreasing the temperature of the test gas, which lowers the viscosity in the denominator of the Reynolds number while raising the density in the numerator.
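The leverage that cooling provides can be sketched with a back-of-the-envelope calculation. The sketch below is illustrative only: Sutherland's law is only approximate at cryogenic temperatures, and the end-point temperatures are assumptions for the example, not NTF design values.

```python
# Sketch: why cooling the test gas multiplies the Reynolds number.
# Re = rho * V * L / mu.  At constant pressure an ideal gas densifies
# as 1/T; viscosity falls with T (Sutherland's law); and at constant
# Mach number the flow speed scales with sqrt(T).

def sutherland_mu(T, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
    """Dynamic viscosity of air in Pa*s via Sutherland's law."""
    return mu_ref * (T / T_ref) ** 1.5 * (T_ref + S) / (T + S)

def reynolds_gain(T_warm, T_cold):
    """Factor by which Re grows when the gas is cooled from T_warm to
    T_cold at fixed pressure, Mach number, and model size."""
    rho_gain = T_warm / T_cold            # density ~ 1/T at fixed pressure
    v_gain = (T_cold / T_warm) ** 0.5     # V ~ speed of sound ~ sqrt(T)
    mu_gain = sutherland_mu(T_cold) / sutherland_mu(T_warm)
    return rho_gain * v_gain / mu_gain

print(f"Re gain, 300 K -> 100 K: {reynolds_gain(300.0, 100.0):.1f}x")
# -> Re gain, 300 K -> 100 K: 4.6x
```

A roughly fivefold Reynolds number gain from temperature alone, at the same pressure and model size, is the advantage that made a full scale transonic facility affordable.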
The national need for a big transonic tunnel was recognized in the 1960s, and extensive studies of various alternatives began in 1966. They all ran up against the brick wall of high cost until the cryogenic option was proven feasible in 1973. NASA exploited its cryogenic success immediately by proposing a 2.5 meter cryogenic transonic tunnel. At this time, the U.S. Air Force was considering an intermittent high pressure Ludwieg tube tunnel to meet its transonic test requirements. Rather than build both expensive facilities, the Federal Government decided in 1974 to construct a single National Transonic Facility (NTF) at Langley, based on NASA cryogenic developments, to serve all U.S. commercial, military, and scientific requirements. The NTF should come on line in 1982 at a total cost of $85 million, making it the most ambitious and expensive wind tunnel ever built.
The circuit diagram of the National Transonic Facility.
Langley razed the old 4 foot supersonic pressure tunnel to make room for the NTF. The drive motors, buildings, and cooling towers were spared (saving $20 million) and became an integral part of the new tunnel. The NTF circuit arrangement does not appear revolutionary; it is a single return, fan driven tunnel with a 2.5 meter slotted wall test section. Conventionality ends there. The 120 000 horsepower electric drive includes a two speed gear turning a fan with controllable pitch. In the tunnel itself, test section isolation valves will be installed. Shell pressures will vary from a near vacuum to 9 atmospheres, while test gas temperatures range from -300° to 175° F. The NTF will operate continuously in one of two modes: a cryogenic mode, in which up to 1200 pounds per second of liquid nitrogen will be injected and gasified, and a conventional, noncryogenic mode using air as the test medium. Although the tunnel can operate continuously in either mode, the cost of the cryogenic mode is very high, though no higher than that of noncryogenic tunnels operating at equivalent test conditions. To achieve high quality, silent flow, the tunnel designers placed four fine mesh screens in the settling chamber and 3500 square feet of sound absorbing panels at strategic locations. Fortunately, the low drive power demands of a cryogenic tunnel also reduce noise levels. The NTF is expected to be the quietest of all transonic facilities.
Like most modern scientific and engineering facilities, the NTF is highly computerized; four separate computers will handle data acquisition and display, tunnel and test model control, data base management, communications, and facility monitoring. Data acquisition rates will reach 50 000 points per second, so that even very short runs (minutes rather than hours) in the cryogenic mode can still be highly productive. The NTF epitomizes the modern, computerized, highly automated scientific facility. It should give the United States a full 5 year lead over other countries and provide aspiring young aerodynamicists the wherewithal to design the aerospacecraft of the future.
At Ames, NASA is upgrading the already impressive 40 x 80 foot tunnel to higher speeds and even larger size: specifically, a dual section tunnel, 40 x 80 feet and 80 x 120 feet, which will be one of the world's largest manmade structures and visible to the naked eye from low Earth orbit.
The original Ames 40 x 80-foot tunnel began operation in 1944 and has seen over 100 aircraft in its test section, spanning 35 years of aviation history, from World War II fighters to the Space Shuttle. Why modify such a successful facility? The primary driving force is the need to test evolving VTOL craft at full scale. These vehicles are becoming bigger and faster, and flight testing them without wind tunnel trials can lead to disaster, as the history of VTOL flight has repeatedly demonstrated. For example, two U.S. rotary wing aircraft bypassed full scale tunnel tests and subsequently crashed during flight testing. One of them encountered technical problems so serious that its $400 million development program was terminated. In contrast, three other VTOL aircraft did take advantage of the Ames 40 x 80-foot tunnel. They too failed dramatically at first, but the technical difficulties were resolved in the wind tunnel prior to flight testing, and these craft eventually succeeded.
Of crucial importance in VTOL testing is the elimination of tunnel wall interference. Ames engineers, however, were originally stymied in their plans to both expand the 40 x 80-foot test section to 80 x 120 feet and raise airspeeds to 300 knots; it would have cost far too much to reach both objectives. Instead, a compromise was reached. Tunnel power was raised to 135 000 horsepower, enough to attain 300 knots in the 40 x 80-foot test section, yet still sufficient to drive an 80 x 120-foot nonreturn leg at more than 100 knots. The old 40 x 80-foot single return circuit would remain essentially intact, but a large, complex system of turning vanes and louvers would deflect flow into the grafted 80 x 120-foot leg when desired.
Modifications of existing structures can frequently be more frustrating than building new ones. A central problem in this instance was replacing the original six 6000 horsepower drive motors with a 135 000 horsepower system within the same space. Fortunately, drive system engineering had improved greatly in 35 years. By using modern synchronous motors with controllable pitch fans and solid state variable frequency speed controls, the new drive system was squeezed into the existing motor support structure. Miniaturization, however, was not the goal in the 80 x 120-foot appendage to the old tunnel circuit. The new structure, in fact, was big enough to remind the viewer of the hoary adage about the "tail wagging the dog."
With very few exceptions, the potential of a new discovery in aerodynamics will not be realized until it is fully validated in three ways: (1) through theoretical analysis, (2) in wind tunnels, and (3) in actual flight testing. The discovery may arise in either theory or practice, but these three confirmations must occur for it to be widely accepted. This triad is the cornerstone of aeronautical progress.
The famed NACA cowl of 1928 vintage had its genesis more in experiment than in theoretical analysis. In contrast, the revolutionary swept wing came from the theoretical work of the German scientist Adolf Busemann in 1935 (and, independently, of Robert Jones of NACA in 1944). Both testing and analysis concurred that these ideas were sound, and they duly reached fruition. The supercritical wing, though, was a classic example of an incomplete triad. Both tunnel tests and flight tests demonstrated that the new airfoil held great promise in the subsonic regime, but a firm theoretical basis for the concept did not exist. Therefore, general acceptance by the aircraft industry was not forthcoming. Without theoretical underpinnings, interest in the supercritical wing waned. (The blunt, thick wing also defied conventional wisdom that efficient high speed wings had to be very thin with small leading edge radii.) The analytical cavalry came to the rescue only after a nationwide effort by aerodynamicists and mathematicians finally put the supercritical wing on a rational basis. The triad was complete and success assured.
In actuality, the theoretical leg of the supercritical wing triad was propped up not only by better theory but also by powerful, recently acquired supercomputers, which were finally fast enough to handle the myriad terms in the complicated aerodynamic equations. The computer revolution had at last invaded the field of aerodynamics, just in time to validate the supercritical wing.
The ultimate goal of computational aerodynamics is the mathematical simulation of airflow over a complete aircraft, free of any approximations, over the entire speed range from subsonic to hypersonic flight. In principle, the computer, with its flowing electrons, can do the wind tunnel's job faster and cheaper. In addition, computational aerodynamics is not restricted by the usual wind tunnel concerns of Reynolds number, high temperatures, wall interference, flow quality, and so on. The Reynolds number can be made any value desired just by punching it into the computer. Flow can be mathematically perfect, and the air temperature can be pushed high enough to vaporize the simulated aircraft or lowered below the liquefaction temperature of air without incurring the usual practical wind tunnel problems. These advantages make computational aerodynamics a most promising field for future exploitation.
Let us dwell a moment longer on the favorable aspects of the electronic wind tunnel. Dean Chapman, in his 1979 Dryden Lectureship for Research, dramatized the great speed of today's supercomputers: for less than $1000 and 30 minutes of computer time, one can now numerically simulate flow over an airfoil using certain equations. The same computational task on the computers available 20 years ago would have cost $10 million and taken 30 years to complete.
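The scale of that improvement is worth working out explicitly. A quick arithmetic sketch, using only the dollar and time figures quoted above:

```python
# Ratios behind Chapman's comparison (illustrative arithmetic only).
old_cost, new_cost = 10_000_000, 1_000   # dollars
old_time_h = 30 * 365 * 24               # 30 years, in hours
new_time_h = 0.5                         # 30 minutes, in hours

print(f"cost ratio: {old_cost / new_cost:,.0f}x")      # 10,000x
print(f"time ratio: {old_time_h / new_time_h:,.0f}x")  # 525,600x
```

A ten-thousandfold drop in cost and a half-millionfold drop in turnaround time in two decades is the kind of progress no hardware facility can match.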
But there are problems even in our modern electronic Garden of Eden. The mathematical description of fluid motion is embodied in the basic Navier Stokes equations, propounded in 1827. The complete equations are highly complex, involving 60 partial derivative terms, and the present generation of computers can handle them, as applied to a complete aircraft, only if various approximations are made. At high angles of attack and high Mach numbers, where flow separation may occur, computational aerodynamics still leaves much to be desired. To improve the electronic wind tunnel, special purpose data processors are being designed specifically to handle the Navier Stokes equations.
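For a flavor of what these equations look like, here is the incompressible form, a much reduced special case of the full compressible system the text describes (constant density and viscosity are simplifying assumptions for illustration):

```latex
% Incompressible Navier-Stokes: momentum balance plus mass conservation.
% u = velocity vector, p = pressure, rho = density, mu = viscosity, f = body force.
\rho\left(\frac{\partial \mathbf{u}}{\partial t}
      + (\mathbf{u}\cdot\nabla)\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0
```

The nonlinear convection term, (u·∇)u, is what makes the system so stubborn: it couples the unknowns to themselves, and in the compressible, viscous form the count of partial derivative terms multiplies accordingly.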
Both electronic and hardware wind tunnels will help shape the future of flight, a symbiotic partnership. For this partnership to prosper, the electronic wind tunnel must be nurtured like the long sequence of nuts and bolts facilities described in earlier chapters. To this end, a National Aerodynamic Simulation Facility (NASF) has been conceived at Ames that would, like the National Transonic Facility, serve the needs of NASA, the military, science, and industry. The NASF would first of all complement wind tunnels in the aerodynamic design process. It would also be a valuable tool for advanced research in the field of fluid dynamics. It would strive to overcome the fundamental limitations of today's embryonic electronic wind tunnels, namely, inadequate computational speeds, too small memories, inappropriate computer design (architecture), and poor numerical flow models (algorithms).
The computer requirements of the proposed NASF are impressive, even in terms of modern computer superlatives. One billion arithmetic operations per second are specified, approximately 25 times the speed of current computers. The memory needed would be 100 times greater than present capacities: at least 40 million words, extensible to several hundred million. Are the Navier Stokes equations this intimidating? When one realizes that the simulated flow field could encompass 500 000 mathematical points and that the equations themselves are unusually complex, the answer must be yes. In fact, the NASF is only an intermediate goal. Really good flow field simulation would demand a trillion arithmetic operations per second (1000 times the NASF objective), with a corresponding expansion of computer memory. Computer speeds have been increasing by a factor of ten every 8 years, however, so a 1000 fold increase is not an idle dream.
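Taken at face value, the stated growth rate gives a concrete timetable. A quick projection sketch, where the tenfold-every-8-years rate is the text's own assumption:

```python
import math

# Projection: how long until computers are 1000x faster, assuming
# the stated rule of thumb (speed grows tenfold every 8 years)?
growth_factor = 10     # speed multiplier per period
period_years = 8
target_gain = 1000     # from 10^9 to 10^12 operations per second

periods_needed = math.log(target_gain, growth_factor)  # 3 orders of magnitude
print(f"years to a {target_gain}x speedup: {periods_needed * period_years:.0f}")
# -> years to a 1000x speedup: 24
```

Roughly a quarter century, which is why the trillion-operation goal reads as ambitious rather than fanciful.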
For all this delving into the future, the National Aerodynamic Simulation Facility is not yet a reality. The best estimates indicate that the cost of assembling the computers, software, and personnel would rival that of the National Transonic Facility (about $85 million). It is an investment in the future that we can delay but not avoid. The history of wind tunnels and flight has repeatedly demonstrated that bold steps forward go hand in hand with technological leadership.