TV Broadcast Lunar Landing

The television broadcast of the 1969 Apollo 11 Moon landing was shared between the USA and Australia. Goldstone in the US began the tracking and receiving; however, a few seconds into the first steps onto the Moon, the broadcast switched to signals being received and uplinked from Australia. Of course, all the information here is the ‘official’ account. It is up to MoonHoax investigators to find evidence to the contrary.

[Honeysuckle Creek NSW Australia]

You will note in this video that the narrator describes a technician who recorded the Apollo 11 Moon landing by filming the main TV screen with his hand-held movie camera. While he did this, a bright glare appeared over the center of the screen. This, the narrator says, was caused by an attempt to adjust the TV controls so that the technician could get the best picture on his camera. Obviously the filming was for his own personal records. Later in the video, the narrator says that the images being watched are live from the Moon. However, the light glare is still there, suggesting that we are watching the recording and not the live event.

OTC Moree Earth Station

The Apollo TV was sent from Sydney Video via PMG lines to the OTC’s satellite earth station at Moree in northern New South Wales.

From there it was sent to the Intelsat III F4 geosynchronous telecommunications satellite above the Pacific Ocean.

This photo of the OTC’s 90 foot Moree dish was taken from the cover of a 1969 OTC publication. Scan: Colin Mackellar.

The first thing to notice here is that the Soviet Union is NOT within the Intelsat’s footprint unless it had a receiving station inland from the Bering Sea.

[Honeysuckle Creek NSW Australia] The only place in the world where the public saw the lunar TV direct, and not via Houston, was in Australia.

There is so much controversy about the ‘Apollo Moon’ landing broadcast. Basically, the TV images and sound were picked up in Australia. [Wikipedia] The first images from the Moon were upside down, so engineers on Earth operated an electronic switch on receiving the signal to correct the picture. All transmissions from the Moon were in black and white. When Buzz Aldrin became the second man on the Moon twenty minutes later, the picture quality had improved: after moon-rise in Australia the signal had moved from the smaller Goldstone antenna in California to the stronger signal received on the main on-axis receiver of the Parkes[10] radio telescope in Australia, and was then relayed via the Honeysuckle Creek station to Sydney for subsequent uplink and distribution to the US. [Wikipedia_1]

[The New Zealand Herald, 21 Jul 2017, 5:00 am] When men first set foot on the Moon 48 years ago today, there was no live TV feed of the momentous event to New Zealand.
Instead, Kiwis were glued to their radios, and had to wait for a delayed TV broadcast in the early evening, after a Royal New Zealand Air Force bomber had delivered the tape from Sydney. [NZ Herald] It was the same for Perth, Western Australia.

Image from Parkes Radio Telescope Apollo 11 raw footage.

The above still shot from the Apollo 11 Moon landing shows a remarkably clearer view than the video that was eventually broadcast to the world. Note the strange broad stripes over the image.

This image is what we actually saw that day in 1969. The raw footage has been ‘lost’ by NASA.

In the above image noise has been reduced to show a clearer image [reduced in PSPx9].

Now this image has some areas marked out. What are they?

And now (above) we can see, with enhancements, some strange structures in the background.

The Cameras

[Wikipedia]The Apollo TV Camera refers to several television cameras used in the Apollo program’s space missions, and on the later Skylab and Apollo-Soyuz Test Project missions, in the late 1960s and 1970s. These cameras varied in design, with image quality improving significantly with each successive model. Two companies made these various camera systems: RCA and Westinghouse. Originally, these slow-scan television (SSTV) cameras, running at 10 frames-per-second (fps), produced only black-and-white pictures and first flew on the Apollo 7 mission in October 1968. A color camera — using a field-sequential color system — flew on the Apollo 10 mission in May 1969, and every mission after that. The Color Camera ran at the North American standard 30 fps. The cameras all used image pickup tubes that were initially fragile, as one was irreparably damaged during the live broadcast of the Apollo 12 mission’s first moonwalk. Starting with the Apollo 15 mission, a more robust, damage-resistant camera was used on the lunar surface. All of these cameras required signal processing back on Earth to make the frame rate and color encoding compatible with analog broadcast television standards.

Starting with Apollo 7, a camera was carried on every Apollo Command Module (CM) except Apollo 9. For each lunar landing mission, a camera was also placed inside the Lunar Module (LM) Descent Stage’s Modularized Equipment Stowage Assembly (MESA). Positioning the camera in the MESA made it possible to telecast the astronauts’ first steps as they climbed down the LM’s ladder at the start of a mission’s first moonwalk/EVA. Afterwards, the camera would be detached from its mount in the MESA, mounted on a tripod and carried away from the LM to show the EVA’s progress; or, mounted on a Lunar Roving Vehicle (LRV), where it could be remotely controlled from Mission Control on Earth.

RCA Command Module TV camera

Apollo 7 slow-scan TV, transmitted by the RCA Command Module TV Camera.

NASA decided on initial specifications for TV on the Apollo Command Module (CM) in 1962.[2][Note 1] Both analog and digital transmission techniques were studied, but the early digital systems still used more bandwidth than an analog approach: 20 MHz for the digital system, compared to 500 kHz for the analog system.[2] The analog video standard for the Block I CM, used on the early Apollo missions, was set as follows: a monochrome signal with 320 active scan lines, progressively scanned at 10 frames per second (fps). RCA was given the contract to manufacture such a camera.[2] It was understood at the time that motion fidelity from such a slow-scan television (SSTV) system would be less than standard commercial television, but this was deemed sufficient considering that astronauts would not be moving quickly in orbit, or even on the Lunar surface.[5]

Video signal processing

Since the camera’s scan rate was much lower than the approximately 30 fps for NTSC video,[Note 2] the television standard used in North America at the time, a real-time scan conversion was needed to be able to show its images on a regular TV set. NASA selected a scan converter manufactured by RCA to convert the black-and-white SSTV signals from the Apollo 7, 8, 9 and 11 missions.[6]

When the Apollo TV camera radioed its images, the ground stations received its raw unconverted SSTV signal and split it into two branches. One signal branch was sent unprocessed to a fourteen-track analog data tape recorder where it was recorded onto fourteen-inch diameter reels of one-inch-wide analog magnetic data tapes at 3.04 meters per second.[7] The other raw SSTV signal branch was sent to the RCA scan converter where it would be processed into an NTSC broadcast television signal.[7]

The conversion process started when the signal was sent to the RCA converter’s high-quality 10-inch video monitor, where a conventional RCA TK-22 television camera — using the NTSC broadcast standard of 525 scanned lines interlaced at 30 fps — simply re-photographed its screen. The monitor had persistent phosphors that acted as a primitive framebuffer.[8] An analog disk recorder, based on the Ampex HS-100 model, was used to record the first field from the camera.[8] It then fed that field, and an appropriately time-delayed copy of it, to the NTSC Field Interlace Switch (encoder). The combined original and copied fields created the first full 525-line interlaced frame, and the signal was then sent to Houston.[8] The system repeated this sequence five more times, until it imaged the next SSTV frame,[8] and then repeated the whole process with each new frame downloaded from space in real time.[9] In this way, the chain produced the extra 20 frames per second needed to deliver flicker-free images to the world’s television broadcasters.[6]
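
The rate conversion described above can be sketched numerically. This is only an illustrative toy model (the frame labels and function name are invented, not taken from any NASA or RCA document): each 10 fps SSTV frame is re-used to build three interlaced NTSC frames, i.e. six fields.

```python
# Toy sketch of the RCA scan-converter chain: each slow-scan frame
# (10 fps) is re-photographed and replayed to produce three interlaced
# NTSC frames (six fields), giving a 30 fps output stream.
# All names here are illustrative, not from any NASA/RCA source.

def convert_sstv_to_ntsc(sstv_frames):
    """Expand 10 fps SSTV frames into a 60 fields-per-second stream."""
    fields = []
    for frame in sstv_frames:
        # The disk recorder captured one field from the monitor camera,
        # then fed it plus a time-delayed copy to the interlace switch;
        # this repeats until the next SSTV frame arrives.
        for _ in range(3):                   # 3 NTSC frames per SSTV frame
            fields.append((frame, "odd"))    # recorded field
            fields.append((frame, "even"))   # delayed copy
    return fields

one_second = [f"sstv_{i}" for i in range(10)]  # 10 SSTV frames = 1 second
ntsc_fields = convert_sstv_to_ntsc(one_second)
print(len(ntsc_fields))  # 60 fields, i.e. 30 interlaced frames per second
```

Note that no new picture information is created: the extra 20 frames per second are copies of existing frames, which is part of why the converted Apollo video has its characteristic smeared, low-motion-fidelity look.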

This live conversion was crude compared to early 21st-century electronic digital conversion techniques. Image degradation was unavoidable with this system as the monitor and camera’s optical limitations significantly lowered the original SSTV signal’s contrast, brightness and resolution. The video seen on home television sets was further degraded by the very long and noisy analog transmission path.[10] The converted signal was sent by satellite from the receiving ground stations to Houston, Texas. Then the network pool feed was sent by microwave relay to New York, where it was broadcast live to the United States and the world.[11]

Operational history

Earth seen during the Apollo 8 live TV transmission on 23 December 1968 using the 100 mm telephoto lens on the RCA Command Module TV Camera.

Apollo 7 and Apollo 8 used an RCA slow-scan, black-and-white camera.[12] On Apollo 7, the camera could be fitted with either a wide angle 160 degree lens, or a telephoto lens with a 9 degree angle of view.[13] The camera did not have a viewfinder or a monitor, so astronauts needed help from Mission Control when aiming the camera in telephoto mode.[Note 3]


The camera used interchangeable lenses, including a wide-angle lens with a 160 degree field-of-view, and a 100 mm telephoto lens.[16]

Camera[Note 4]

Camera name Command Module Television Camera, Block I
Supplier RCA
Sensor Vidicon Tube
Sensor size one-inch tube
Field Scan type progressive scan
Frame rate 10 fps
Frame size 320 scan lines
Resolution 200 lines
Color encoder monochrome
Aspect ratio 4:3
Bandwidth 500 kHz
Power Consumption 6.5 watts @ 28 volts DC
Weight 2,041 grams (72.0 oz)
Dimensions 210 mm × 95 mm × 76 mm (8.3 in × 3.7 in × 3.0 in) LxHxW
Lens mount type Bayonet

Westinghouse Apollo Lunar Television Camera


Lunar Module training mockup, showing relative position of deployed camera

In October 1964, NASA awarded Westinghouse the contract for the Lunar TV Camera.[19] Stan Lebar, the Program Manager for the Apollo Lunar TV Camera, headed the team at Westinghouse that developed the camera that brought pictures from the Moon’s surface.

The camera had to be designed to survive extreme temperature differences on the lunar surface, ranging from 121 °C (250 °F) in daylight to −157 °C (−251 °F) in the shade.[10] Another requirement was to keep the power draw to approximately 7 watts, and to fit the signal into the narrow bandwidth of the LM’s S-band antenna, which was much smaller and less powerful than the Service Module’s antenna.[20][Note 5]

Operational history

The camera was first tested in space during the Apollo 9 mission in March 1969.[21] The camera was stowed in the LM, and it used the LM’s communications systems to evaluate their performance before lunar operations began.[22] This meant that the CM did not carry a video camera for this mission.[23] It was next used on Apollo 11, carried in the LM’s descent stage, in the quad 4 Modularized Equipment Stowage Assembly (MESA). It was from the MESA that it captured humanity’s first step on another celestial body on 21 July 1969.[21] Apollo 11 would be the first and last time the camera was used on the Lunar surface; however, it flew as a backup camera on the missions from Apollo 13 to Apollo 16, in case the color cameras suffered a similar fate to the Apollo 12 camera.[1]


The camera’s dimensions were 269 mm × 165 mm × 86 mm (10.6 in × 6.5 in × 3.4 in) in size, and weighed 3.29 kilograms (7.3 lb). It consumed 6.50 watts of power. Its bayonet lens mount allowed for quick changes for the two interchangeable lenses used on Apollo 11: a wide-angle and a lunar day lens.[24][Note 6]


NASA Component No. SEB16101081-701[26]
Supplier Westinghouse[1]
Sensor Westinghouse WL30691 Secondary Electron Conduction Tube (SEC)[27]
Sensor size 1/2 inch tube[28]
Field Scan type progressive scan
Frame rate 10 fps at 320 lines, 0.625 fps at 1280 lines[29]
Frame size 320 scan lines (10 fps) and 1280 scan lines (0.625 fps)[29]
Resolution 200 lines (10 fps),[30] 500 lines (0.625 fps)[31]
Color encoder monochrome[1]
Aspect ratio 4:3[29]
Bandwidth 500 kHz[29]
Power Consumption 6.5 watts @ 24–31.5 volts DC[32]
Weight 3.29 kilograms (7.3 lb)[24]
Dimensions 269 mm × 165 mm × 86 mm (10.6 in × 6.5 in × 3.4 in) LxHxW[24]
Lens mount type Bayonet[24]
  • Photo of the high-quality SSTV image received from Apollo 11 at Honeysuckle Creek Tracking Station
  • Photo of the high-quality SSTV image before the scan conversion
  • Westinghouse camera on the Lunar surface during Apollo 11

Westinghouse Lunar Color Camera

Choosing a color process

Stan Lebar, the project manager for Westinghouse’s Apollo Television Cameras, shows the Field-Sequential Color Camera on the left and the Monochrome Lunar Surface Camera on the right.

Color broadcast studio television cameras in the 1960s, such as the RCA TK-41, were large, heavy and power-hungry beasts. They used three imaging tubes to generate red, green and blue (RGB) video signals which were combined to produce a composite color picture. These cameras required complex optics to keep the tubes aligned. Since temperature variations and vibration would easily put a three-tube system out of alignment, a more robust system was needed for lunar surface operations.[34]

In the 1940s, CBS Laboratories invented an early color system that utilized a wheel, containing six color filters, rotated in front of a single video camera tube to generate the RGB signal.[35] Called a field-sequential color system, it used interlaced video, with sequentially alternating color video fields to produce one complete video frame. That meant that the first field would be red, the second blue, and the third field green — matching the color filters on the wheel and also in a different order than NTSC.[35] This system was both simpler and more reliable than a standard three-tube color camera, and more power-efficient.[34]

The camera

Lebar and his Westinghouse team wanted to add color to their camera as early as 1967, and they knew that the CBS system would likely be the best system to study.[36] The Westinghouse Lunar Color Camera used a modified version of CBS’s field-sequential color system.[35] A color wheel, with six filter segments, was placed behind the lens mount. It rotated at 9.99 revolutions per second, producing a scan rate of 59.94 fields per second, the same as NTSC video. Synchronization between the color wheel and pickup tube’s scan rate was provided by a magnet on the wheel, which controlled the sync pulse generator that governed the tube’s timing.
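
The timing relationship between the wheel and the NTSC field rate can be checked with simple arithmetic (a hedged sketch; the variable names are mine, not from any Westinghouse document):

```python
# Color-wheel timing check: six filter segments pass in front of the
# pickup tube per revolution, so at 9.99 rev/s the resulting field
# rate matches the NTSC standard exactly.
segments_per_rev = 6
revs_per_second = 9.99
field_rate = segments_per_rev * revs_per_second   # fields per second
print(round(field_rate, 2))       # 59.94, the NTSC field rate
print(round(field_rate / 2, 2))   # 29.97 interlaced frames per second
```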

The Color Camera used the same SEC video imaging tube as the monochrome Lunar Camera flown on Apollo 9. The camera was larger, measuring 430 millimetres (17 in) long, including the new zoom lens. The zoom lens had a focal length variable from 25 mm to 150 mm, with a zoom ratio rated at 6:1. At its widest angle, it had a 43-degree field of view, while in its extreme telephoto mode, it had a 7-degree field of view. The aperture ranged from F4 to F44, with a T5 light transmittance rating.[27]

Color decoding & signal processing

Signal processing was needed at the Earth receiving ground stations to compensate for the Doppler Effect, caused by the spacecraft moving away from or towards the Earth. The Doppler Effect would distort color, so a system that employed two videotape recorders (VTRs), with a tape-loop delay to compensate for the effect, was developed.[35] The cleaned signal was then transmitted to Houston in NTSC-compatible black and white.[Note 8]

Unlike the CBS system, which required a special mechanical receiver on a TV set to decode the color, the signal was decoded in Houston’s Mission Control Center. This video processing occurred in real time. The decoder separately recorded each red, blue and green field onto an analog magnetic disk recorder. Acting as a framebuffer, it then sent the coordinated color information to an encoder to produce an NTSC color video signal, which was then released to the broadcast pool feed.[34] Once the color was decoded, scan conversion was not necessary, because the color camera ran at the same 60-fields-per-second interlace rate as the NTSC standard.[36]

Operational history

It was first used on the Apollo 10 mission. The camera used the Command Module’s extra S-band channel and large S-band antenna to accommodate the camera’s larger bandwidth. It was only used in the Lunar Module when it was docked to the Command Module. Unlike the earlier cameras, it contained a portable video monitor that could be either directly attached to the camera or float separately. Combined with the new zoom lens, it allowed the astronauts to have better precision with their framing.[35]

Apollo 12 was the first mission to use the color camera on the lunar surface. About 42 minutes into telecasting the first EVA, astronaut Alan Bean inadvertently pointed the camera at the Sun while preparing to mount it on the tripod. The Sun’s extreme brightness burned out the video pickup tube, rendering the camera useless. When the camera was returned to Earth, it was shipped to Westinghouse, and they were able to get an image on the section of the tube that wasn’t damaged.[38] Procedures were re-written in order to prevent such damage in the future, including the addition of a lens cap to protect the tube when the camera was repositioned off the MESA.

An Apollo 14 EVA frame demonstrates the “blooming” issue with the Color Camera.

The color camera successfully covered the lunar operations during the Apollo 14 mission in 1971. Image quality issues appeared because the camera’s automatic gain control (AGC) had problems setting the proper exposure in high-contrast lighting, which caused the white spacesuits to be overexposed, or “bloom”. The camera also did not have a gamma correction circuit, which resulted in the image’s mid-tones losing detail.[39]

After Apollo 14, it was only used in the Command Module, as the new RCA-built camera replaced it for lunar surface operations. The Westinghouse Color Camera continued to be used throughout the 1970s on all three Skylab missions and the Apollo–Soyuz Test Project.

The 1969–1970 Emmy Awards for Outstanding Achievement in Technical/Engineering Development were awarded to NASA for the conceptual aspects of the color Apollo television camera and to Westinghouse Electric Corporation for the development of the camera.[40]



NASA Component No. SEB16101081-701[26]
Supplier Westinghouse
Sensor Westinghouse WL30691 Secondary Electron Conduction Tube (SEC)[41]
Resolution more than 200 TV lines (SEC sensor – 350 TV Lines in vertical dimension)
Field Scan rate 59.94 fields-per-second monochrome (color filters alternated between each field)[42]
Frame rate 29.97 frames-per-second [41]
Frame size 525 lines
Color encoder Field-sequential color system[43]
Bandwidth 2 MHz to 3 MHz (Unified S-band bandwidth restrictions)
Power Consumption 17.5 watts @ 28 volts DC[44]
Weight 5 kg (11 lb)[43][44]
Dimensions 287 mm × 170 mm × 115 mm (11.3 by 6.7 by 4.5 inches) LxHxW with handle folded [45]
Lens mount type C mount[46]


NASA Component No. SEB16101081-703[26]
Supplier Angénieux[45]
Focal length 25mm—150mm[47]
Zoom ratio 6:1[47]
Aperture F4 to F44[47]
Light transmission T5[48]
Weight 590 g (21 oz)[44]
Dimensions 145 mm (5.7 in) long, 58.9 mm (2.32 in) lens diameter [45]
Lens mount type C mount ANSI 1000-32NS-2A Thread[46]
  • Apollo 10 TV image of Earth
  • Apollo 11 TV image
  • Westinghouse color camera on the Lunar surface during Apollo 12
  • Edgar Mitchell with the Apollo 14 camera

RCA J-Series Ground-Commanded Television Assembly (GCTA)

Due to Apollo 12’s camera failure, a new contract was awarded to the RCA Astro Electronics facility in East Windsor, NJ. The RCA system used a new, more sensitive and durable TV camera tube. The design team, headed by Robert G. Horner, used the newly developed SIT pickup tube. The improved image quality was obvious to the public in the RCA camera’s better tonal detail in the mid-range, and in the absence of the blooming that was apparent on the previous missions.

The system was composed of the Color Television Camera (CTV) and the Television Control Unit (TCU). These were connected to the Lunar Communications Relay Unit (LCRU) when mounted on the Lunar Roving Vehicle (LRV). Like the Westinghouse Color Camera, it used the field-sequential color system, and used the same ground-station signal processing and color decoding techniques to produce a broadcast NTSC color video signal.

On Apollo 15 the camera produced live images from the LM’s MESA, just as on the previous missions. It was repositioned from the MESA onto a tripod, where it photographed the Lunar Roving Vehicle (LRV) being deployed. Once the LRV was fully deployed, the camera was mounted on it and controlled by commands from the ground to tilt, pan, and zoom in and out. This was the last mission to have live video of the mission’s first steps via the MESA, as on the following flights the camera was stowed with the LRV.

  • Usage: Apollo 15 (lunar surface), Apollo 16 (lunar surface) and Apollo 17 (lunar surface)
  • Resolution: more than 200 TV lines (SIT sensor – 600 TV Lines)
  • Scan rate: 59.94+ fields/s monochrome (color filters alternated between each field) / 29.97+ frame/s / 525 lines/fr / 15734.26+ lines/s
  • Color: Field-sequential color system camera
  • Spectral response: 350–700 nm
  • Gamma: 1.0
  • Sensitivity: > 32 dB signal to noise ratio
  • Dynamic range: > 32:1
  • Bandwidth: up to 5 MHz
  • Sensor: Silicon Intensifier Target (SIT) Tube
  • Optics: 6x zoom, F/2.2 to F/22
  • Automatic light control (ALC): average or peak scene luminance
  • GCTA transmission from the LRV
  • Apollo 15 television camera and high-gain antenna
  • Apollo 16 television camera. Notice the sunshade attached to the top of the lens, a feature first used on Apollo 16.
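
The scan-rate figures in the specification list above can be cross-checked using the exact NTSC rates (30000/1001 frames per second rather than the rounded 29.97). This is plain arithmetic, not data taken from the GCTA manual:

```python
# Cross-check of the GCTA scan-rate figures using exact NTSC timing.
frames_per_second = 30000 / 1001            # ~29.97 frames/s
fields_per_second = 2 * frames_per_second   # ~59.94 fields/s
lines_per_second = 525 * frames_per_second  # ~15734.27 lines/s
print(round(fields_per_second, 2), round(lines_per_second, 2))
```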


Cameras used, CM = Command Module, LM = Lunar Module

  • Apollo 7: RCA B&W SSTV (CM)
  • Apollo 8: RCA B&W SSTV (CM)
  • Apollo 9: Westinghouse B&W (LM)
  • Apollo 10: Westinghouse color (CM)
  • Apollo 11: Westinghouse color (CM), Westinghouse B&W (LM)
  • Apollo 12: Westinghouse color (CM & LM)
  • Apollo 13: Westinghouse color (CM & LM), Westinghouse B&W was a backup for the LM (not used); the LM camera itself was never used
  • Apollo 14: Westinghouse color (CM & LM), Westinghouse B&W was a backup for LM (not used)
  • Apollo 15: Westinghouse color (CM), RCA GCTA (LM), Westinghouse B&W was a backup for LM (not used)
  • Apollo 16: Westinghouse color (CM), RCA GCTA (LM), Westinghouse B&W was a backup for LM (not used)
  • Apollo 17: Westinghouse color (CM), RCA GCTA (LM)[Wikipedia]


  1.   NASA decided to go with a new communications system for the Apollo program that routed all communications signals simultaneously through the Unified S-Band (USB) system. All communication between the spacecraft and the ground was handled by the USB, transmitting on 2287.5 MHz for the CM and 2282.5 MHz for the LM. It had a 3 MHz allotment for all communications, divided into seven components: Voice, Telemetry, Television, Biomedical data, Ranging, Emergency Voice, and Emergency Key.[3] The reason the video signal had to be compressed into such a narrow bandwidth was the way signals were allocated bandwidth. After allocating 1.25 MHz to Voice and 1.024 MHz to Telemetry, only about 700 kHz was available for all other communication signals. In order to produce a clean frequency modulated (FM) transmission for video from the LM on the lunar surface, the Ranging signal was omitted. The Block II CM actually had a second 3 MHz USB channel that could have allowed better resolution and scan rates, but that wasn’t supported until the Apollo 10 mission in 1969.[4]
  2.   For the purposes of clarity and simplicity in this article, 60 fields and 30 frames per second are used. NTSC actually runs at 59.94 fields per second, and 29.97 frames per second. Two interlaced fields create one complete video frame.
  3.   The camera’s lack of either a viewfinder or monitor was apparent when Apollo 8 tried to frame the Earth on their second broadcast from space. The Earth bounced around, often out of view, and Mission Control had to direct the astronauts to move the camera to bring it back into frame.[14] Apollo 8 astronaut William Anders said during the second telecast, that “I hope the next camera has a sight on it,” referring to the RCA camera’s lack of a sighting device.[15]
  4.   All specifications for the RCA Command Module TV Camera are found in Coan’s Apollo Experience Report — Television Systems, except its weight, which is found in Goodwin’s Apollo 7: The Mission Reports.[17][18]
  5.   Since digital video compression techniques weren’t practical at the time (though studied by NASA as a possibility in 1965 in document NASA-CR-65508), the signal was “compressed” by simple analog means: not using color, reducing the image resolution from the NTSC standard 525 lines to 320 lines, and reducing the frame rate from 30 fps to 10 fps. In this way, the Lunar TV camera was able to use about 95 percent less bandwidth than a standard NTSC signal. After Apollo 11, a larger S-band antenna was deployed by astronauts during their first EVA, eventually allowing better video from the lunar surface.[20]
  6.   There were actually four lenses developed for this camera including the lunar day lens and the wide angle lenses. The other two lenses were the lunar night lens and a 100 mm telephoto lens.[25]
  7.   All specifications for the Westinghouse Lunar Surface TV Camera are found in Lebar’s Apollo Lunar Television Camera Operations Manual, pages 2-24 and A-11.[33]
  8.  The unprocessed signal from the Moon, with its fluctuating TV synchronization signals, was sent to the first VTR and recorded on 2-inch tape. The tape was not spooled on that machine but was instead played back on the second VTR, using the steady house sync signal to fix any synchronization issues caused by the Doppler Effect (since the mid-1970s this timebase correction has been accomplished by digital methods).[37][Wikipedia]
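
The Unified S-Band budget described in Note 1 can be checked with back-of-envelope arithmetic (the figures below are taken from the note itself):

```python
# After Voice and Telemetry were allocated, roughly 700 kHz of the
# 3 MHz Unified S-Band allotment remained for everything else,
# including the television signal.
total_mhz = 3.0
voice_mhz = 1.25
telemetry_mhz = 1.024
remaining_khz = round((total_mhz - voice_mhz - telemetry_mhz) * 1000)
print(remaining_khz)  # 726 kHz, i.e. "about 700 kHz"
```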


  1. O’Neil (2009a).
  2. Coan (1973), p. 4.
  3. Peltzer (1966), p. 2.
  4. Wood (2005), p. 1.
  5. Lebar & Hoffman (1967), p. 4.
  6. Steven-Boniecki (2010), p. 129.
  7. Sarkissian (2006), p. 8.
  8. Wood (2005), pp. 5–6.
  9. Sarkissian (2006), p. 6.
  10. Von Baldegg (2012).
  11. Steven-Boniecki (2010), p. 130.
  12. Wood (2005), pp. 1—2.
  13. Steven-Boniecki (2010), p. 55.
  14. Wilford (1971), p. 190.
  15. Associated Press (1968), p. 1.
  16. Coan (1973), p. 8.
  17. Coan (1973), pp. 4—8.
  18. Godwin (2000), p. 44.
  19. Steven-Boniecki (2010), p. 54.
  20. Windley (2011).
  21. Steven-Boniecki (2010), pp. 80–81.
  22.  Wood (2005), p. 8.
  23. Steven-Boniecki (2010), p. 79.
  24. Sarkissian (2001), p. 292.
  25. Lebar (1968), p. 2-24.
  26. Westinghouse (1971), pp. 1-11.
  27. Niemyer (1969), p. 4.
  28. Lebar (1966), p. 17a.
  29. Lebar (1966), p. 12.
  30. Lebar (1966), p. 13.
  31. Lebar (1968), p. 2-22.
  32. Lebar (1967), pp. 1-3.
  33. Lebar (1968), pp. 2-24, A-1.
  34. O’Neil (2009b).
  35. Wetmore (1969), pp. 18, 20.
  36. Steven-Boniecki (2010), pp. 94—103.
  37. Wood (2005), p. 12.
  38. Wood (2005), pp. 25–28.
  39. Wood (2005), pp. 31—32.
  40. Pearson (1969), p. B7.
  41. Niemyer, Jr. (1969), p. 4.
  42. Niemyer, Jr. (1969), p. 5.
  43. Niemyer, Jr. (1969), p. 1.
  44. Westinghouse (1971), pp. 1-3.
  45. Westinghouse (1971), pp. 1-5—1-6.
  46. Westinghouse (1971), pp. 1-9—1-10.
  47. Westinghouse (1971), pp. 2-1.
  48. Westinghouse (1971), pp. 3-9.


  • Associated Press (1968-12-24). “Earth sees itself from Apollo”. The Globe and Mail. Toronto. p. 1.
  • Coan, Paul M. (November 1973), “Apollo Experience Report – Television System”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013), archived (PDF) from the original on 20 October 2013, retrieved 2013-10-20, Originally published by NASA HQ as NASA Technical Note TN-A7476.
  • Lebar, Stanley; Hoffman, Charles P. (6 March 1967), “TV show of the century: A travelogue not atmosphere”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013), archived (PDF) from the original on 20 October 2013, retrieved 2013-10-20, Originally published in ELECTRONICS, published by McGraw Hill (1967).
  • Lebar, Stanley (30 August 1968), “Apollo Lunar Television Camera Operations Manual”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013), archived (PDF) from the original on 20 October 2013, retrieved 2013-10-20
  • Lebar, Stanley (Summer 1997). “The Color War goes to the Moon” (PDF). Invention and Technology. Retrieved 2013-10-18.
  • Niemyer, Jr., L. L. (1969-09-16), “Apollo Color Camera”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013)
  • O’Neal, James E. (6 July 2009). “TV’s Longest Remote”. TV Technology. New York: NewBay Media. Archived from the original on 18 October 2013. Retrieved 2013-10-18.
  • O’Neal, James E. (21 July 2009). “Equipping Apollo for Color Television”. TV Technology. New York: NewBay Media. Archived from the original on 18 October 2013. Retrieved 2013-10-18.
  • O’Neal, James E. (6 August 2009). “Search for Missing Recordings Ends”. TV Technology. New York: NewBay Media. Archived from the original on 18 October 2013. Retrieved 2013-10-18.
  • Pearson, Howard (1969-06-09). “Emmy Awards to Top Shows”. Deseret News. Salt Lake City, Utah. p. B7. Retrieved 2013-10-15.
  • Peltzer, K. E. (1966), “Apollo Unified S-Band System”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013), archived (PDF) from the original on 20 October 2013, retrieved 2013-10-20
  • RCA (25 February 1972), Ground-Commanded Television Assembly (GCTA) (pdf), Houston: NASA, archived (PDF) from the original on 20 October 2013, retrieved 2013-10-20
  • Sarkissian, John M. (2001). “On Eagle’s Wings: The Parkes Observatory’s Support of the Apollo 11 Mission” (PDF). Publications of the Astronomical Society of Australia. Melbourne: CSIRO Publishing. 18: 287–310. Bibcode:2001PASA…18..287S. doi:10.1071/as01038. Archived (PDF) from the original on 18 October 2013. Retrieved 2013-10-17.
  • Sarkissian, John (21 May 2006). “The search for the Apollo 11 SSTV tapes” (pdf). CSIRO Parkes Observatory. Archived (PDF) from the original on 18 October 2013. Retrieved 2013-10-15.
  • Steven-Boniecki, Dwight (2010). Live TV From the Moon. Burlington, Ontario: Apogee Books. ISBN 978-1-926592-16-9.
  • Von Baldegg, Kasia Cieplak-Mayr (20 July 2012). “1 Small Step for a Cam: How Astronauts Shot Video of the Moon Landing”. The Atlantic. Washington, DC. Archived from the original on 16 October 2013. Retrieved 2013-10-16.
  • Westinghouse (1 June 1971), Apollo Color Television Subsystem Operation Manual and Training Manual (pdf), Houston: NASA, archived (PDF) from the original on 19 October 2013, retrieved 2013-10-19
  • Wetmore, Warren C. (1969-05-26). “Docking Transmitted Live in First Color TV From Space”. Aviation Week & Space Technology. Washington, DC. pp. 18, 20.
  • Wilford, John Noble (1971). We Reach the Moon: The New York Times Story of Man’s Greatest Adventure. New York: Bantam Books. ISBN 978-0-552-08205-1.
  • Windley, Jay (2011). “Technology: TV Quality”. Moon Base Clavius. Salt Lake City, Utah. Archived from the original on 9 December 2011. Retrieved 2011-12-09.
  • Wood, Bill (2005), “Apollo Television”, in Jones, Eric M.; Glover, Ken, Apollo Lunar Surface Journal (pdf), Washington, DC: NASA (published 1996–2013)
  • [Wikipedia]
