If this were really the case, I'm surprised the US government didn't engage in antitrust action.
The first book of David Brin's Uplift series was written in 1980 and takes place on an antigravity spaceship that can penetrate deep into the Sun, carrying alien ambassadors. Yet one of the major plot points is someone using the onboard darkroom to develop pictures that reveal something essential.
I'm hoping someone will make a new sci-fi movie with a vintage aesthetic, one that intentionally emphasizes and magnifies this old-school analog awesomeness of galactic empires that seem to entirely lack integrated circuits. Apple TV's "Silo" has wonderful production design, but it's too claustrophobic to fulfill my wish.
“The Mote in God’s Eye” would be my pick if I could get any IP developed with this approach.
Idea for a sci-fi novel: total reliance on chatbots that predict what you want to hear based on the average of the internet ends the astonishing run of innovation we've had since the Industrial Revolution, and returns us to the situation humanity has been in for most of our history, in which technology develops slowly, if at all. What do things look like in a thousand years, when we're still relying on the current equivalent of slide rules and analog film?
Reminds me of Harry Turtledove's The Road Not Taken.
https://en.wikipedia.org/wiki/The_Road_Not_Taken_(short_stor...
This is what I hoped for from Foundation: a replication of the 1940s futurism (now retrofuturism) I imagine while reading the books. Alas, it wasn't to be.
The “Foundation” we got has good moments and excellent production values, but it doesn’t seem to know or care exactly what the rules of its universe are. (I don’t like how Hari Seldon was apparently a font of semi-magical technology invented all at once and in secret…)
Plus, it’s just one of the best TV shows ever made in any genre.
But had I been in that place at that time, I would not have invented the digital camera. That guy Sasson was clearly capable far beyond the rest of us.
I consider Wozniak (the obvious example), who was in the "right time and place" in the early 1970s. He was at the engineering capital of the U.S. (Silicon Valley, already known by that name at the time), and he knew adults in engineering fields who could get him microprocessor chips that were otherwise expensive and brand-new at the time… just as the chips were becoming more affordable, and just as Don Lancaster's "TV Typewriter" and the "Altair 8800" began to grace magazine covers (Radio-Electronics and Popular Electronics, respectively)…
Woz seemed to flounder, or be somewhat overwhelmed, a decade later, when hacks with a 555 timer chip, a few NAND gates, or NTSC timing hijinks to get color were no longer where the industry was going. He took a back seat on the engineering side.
At the same time, and not to diminish Woz's skills in 1975, there were a lot of other smart kids in the "Valley" then who did have their home-brew computers become products.
(And then there's so much more to unpack when you allow for Jobs's contributions, U.S. schools purchasing Apple computers, etc.)
[1] https://petapixel.com/how-steve-sasson-invented-the-digital-...
Edit: it's very likely that no photos exist because the tapes were being reused and there are many reasons why the camera has been nonfunctional for a long time now.
And the photos in the article of the old "Instamatic" Kodak film cameras (especially that 110 pocket camera) suddenly brought back to mind the formaldehyde-like smell of developer chemicals from when I worked at a one-hour photo lab in high school.
https://petapixel.com/what-is-ccd-cmos-sensor/
and https://www.teledynevisionsolutions.com/learn/learning-cente...
Steve Sasson's tale of technical struggle in 01975 at Kodak is real, but dozens of other people were doing the same thing at the same time at different companies, or in their dormitories, because at that point the problem of building a handheld digital camera had been reduced to a problem that one guy could solve with off-the-shelf parts. In fact, earlier the same year, a digital camera design was published as a hobbyist project in Popular Electronics, using a 32×32 MOS sensor, and commercialized as the Cromemco Cyclops. (You just had to keep it plugged in; you couldn't take it with you to the Little League game, even though it was small enough to lift in one hand.) https://en.wikipedia.org/wiki/Cromemco_Cyclops
The reduction of the problem to such a manageable size was the result of numerous small advances over the previous 50 years.
Landsat 1 was a digital camera that was initially planned in 01970 and launched into space in 01972; it just weighed a tonne, so you couldn't hold it in your hand. https://directory.eoportal.org/satellite-missions/landsat-1-... says:
> It quickly became apparent that the digital image data, acquired by the MSS (Multispectral Scanner) instrument, a whiskbroom scanning device, were of great value for a broad range of applications and scientific investigations. For the first time, the data of an orbiting instrument were available in digital form, quantified at the instrument level - providing a great deal of flexibility by offering all the capabilities of digital processing, storage, and communication.
Landsat 1 was built by General Electric, RCA, NASA, and subcontractors, and the MSS digital camera component in particular was designed by Virginia Norwood at the Hughes Aircraft Company, not at Kodak.
Ranger 7 in 01964 https://en.wikipedia.org/wiki/Ranger_7 was an electronic camera that was successfully launched into the moon and returned close-range photos of it over radio links, but, as far as I can tell, it wasn't a digital camera; the RF links were analog TV signals.
Handheld electronic cameras, for a very strong person, might date back to Philo T. Farnsworth's Image Dissector in 01927 https://en.wikipedia.org/wiki/Video_camera_tube#Experiments_... or Zworykin's Iconoscope in 01933 https://en.wikipedia.org/wiki/Video_camera_tube#Iconoscope, but in practice these were only reduced to handheld-plus-backpack size in the 01950s https://en.wikipedia.org/wiki/Professional_video_camera#Hist.... Farnsworth was at the Farnsworth Television and Radio Corporation, not at Kodak. Zworykin was at Westinghouse and RCA, not at Kodak.
The first experimental digitization of a signal from an electronic camera was probably done by Frank Gray at Bell Labs, not at Kodak, in 01947, for which he invented the Gray Code. To be able to keep up with live full-motion video data, his analog-to-digital converter was a sort of cathode-ray tube with a shadow mask in it with the code cut into it; this is described in patent 2,632,058, granted in 01953: https://patentimages.storage.googleapis.com/a3/d7/f2/0343f5f....
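For anyone who hasn't run into it: the point of the Gray code is that adjacent quantization levels differ in exactly one bit, so a marginal reading at a level boundary in a coding tube like Gray's is off by at most one step instead of producing a wildly wrong value. Here's a minimal Python sketch of the binary-reflected code itself (just the code, nothing to do with Gray's actual circuit):

    # Binary-reflected Gray code: consecutive values differ in exactly one
    # bit, so a borderline read errs by at most one quantization level.
    def to_gray(n: int) -> int:
        return n ^ (n >> 1)

    def from_gray(g: int) -> int:
        n = 0
        while g:          # XOR-fold the shifted copies back out to invert to_gray
            n ^= g
            g >>= 1
        return n

    for i in range(8):
        print(f"{i} -> {to_gray(i):03b}")
    # 0->000, 1->001, 2->011, 3->010, 4->110, 5->111, 6->101, 7->100

Note how each printed codeword differs from its neighbors in a single bit.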
The video camera tubes that were the only way to build electronic cameras until the CCD came along, and which made the cameras large and heavy, were supplanted by CCDs like the 100×100 Fairchild MV-101 that Sasson used in his prototype at Kodak. The CCD was developed by Smith and Boyle at Bell Labs, not at Kodak, in 01969–70: https://en.wikipedia.org/wiki/Charge-coupled_device
However, any DRAM chip is also an image sensor, which is why they are encapsulated in black epoxy to prevent them from sensing light; without the CCD, we would have had CMOS image sensors anyway just because of the light-sensitivity of silicon. In fact, the Cromemco Cyclops used just such a chip.
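To make the DRAM-as-sensor idea concrete, here is a toy simulation; it's purely illustrative, with an invented linear leakage model and made-up constants, not a description of how the Cyclops actually worked. Each cell is a capacitor written to '1'; light accelerates its leakage, so one refresh-free read after a fixed exposure yields one bit per pixel:

    # Toy model of a decapped DRAM used as a 1-bit image sensor.
    # (Illustrative only: the constants and the linear leakage model
    # are invented.)  Every cell starts charged to '1'; photons speed
    # up leakage, so brightly lit cells discharge below the sense-amp
    # threshold during the exposure and read back as '0'.
    W, H = 32, 32            # the Cyclops's sensor was also 32x32
    THRESHOLD = 0.5          # sense-amp decision level (fraction of full charge)
    EXPOSURE = 1.0           # exposure time, arbitrary units
    DARK_LEAK = 0.1          # charge lost per unit time in darkness
    LEAK_PER_LUX = 0.5       # extra charge lost per unit time per unit brightness

    def capture(scene):
        """scene: H x W brightness values in [0, 1]; returns H x W bits."""
        frame = []
        for row in scene:
            bits = []
            for lux in row:
                charge = 1.0 - EXPOSURE * (DARK_LEAK + LEAK_PER_LUX * lux)
                bits.append(1 if charge > THRESHOLD else 0)  # 0 = light hit
            frame.append(bits)
        return frame

    # A synthetic scene, bright on the left fading to dark on the right:
    scene = [[1.0 - x / (W - 1) for x in range(W)] for _ in range(H)]
    print("".join(str(b) for b in capture(scene)[0]))  # 0000000111...1

The exposure time effectively sets the threshold between light and dark, so repeated exposures of different lengths could in principle be stacked to approximate a grayscale image.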
The fundamental thing that made digital cameras not just possible but inevitable was microelectronics, a technology which owes its existence in 01975 to a long series of innovations including the point-contact transistor (Bardeen and Brattain, 01947, Bell Labs, not at Kodak); the junction transistor (Shockley, 01948, Bell Labs, not at Kodak); the monolithic integrated circuit (Noyce, 01959, Fairchild Semi, not at Kodak); the planar process (Hoerni, 01959, Fairchild Semi, not at Kodak); the MOSFET (Kahng and Atalla, 01959, Bell Labs, not at Kodak); the self-aligned silicon gate (Faggin, 01968, Fairchild Semi, not at Kodak); and, as mentioned in the article, the microprocessor. The microprocessor was overdetermined in the same way as the handheld digital camera, and arose basically simultaneously at RCA, Motorola, TI, and Intel, but whoever we decide invented the microprocessor, it certainly wasn't done at Kodak.
That's fine by me. The informative posts are worth it.
A lot of it was because the film people kneecapped the digital folks.
Film was very profitable.
Until it wasn't.
The company that I worked for was a classic film company. When digital was first getting a foothold (early 1990s), I used to get lectures about how film would never die, etc.
A few years later, it was as if film never existed. The transition was so sudden, and so complete, that, if you blinked, you missed it.
Years later, I saw the same kind of thing that happened to Kodak happen to my company.
The iPhone came out, with its embedded camera, and that basically killed the discrete point-and-shoot market, which was very profitable for my company.
When the iPhone first came out, the marketing folks at my company laughed at it.
Then, they stopped laughing.
https://petapixel.com/why-kodak-died-and-fujifilm-thrived-a-...
TL;DR: Fujifilm diversified quickly; Kodak clung to the film business for far too long.
This part reminded me of the "Black Triangle" story (2004).
This is a very common story, from what I understand, whether the intent is "if you can't beat them, buy them!" or simply to grow.
In Kodak's case, I wonder if both those who saw it as the future and those who saw it as the end wanted to support and control it.
Also, it never ceases to amaze me that some of the best things and the most dangerous things are (1) not the ones you planned on and (2) the result of someone bending and breaking rules to pursue a passion project.
Other companies had already invented the CCD; it was only a matter of time before someone would digitise the signal and pair it with a storage device. It was an obvious concept.
All Kodak really did was develop an obvious concept into a prototype many years before it could be viable, and then receive a patent for it.