
The History of the First Chocolate


In 1847, the British chocolate maker J.S. Fry and Sons created the first chocolate bar, molded from a paste of sugar, chocolate liquor and cocoa butter. Swiss chocolatier Daniel Peter is generally credited with adding dried milk powder to chocolate to create milk chocolate in 1876.

When most of us hear the word chocolate, we picture a bar, a box of bonbons, or a bunny. The verb that comes to mind is probably “eat,” not “drink,” and the most apt adjective would seem to be “sweet.” But for about 90 percent of chocolate’s long history, it was strictly a beverage, and sugar didn’t have anything to do with it.

“I often call chocolate the best-known food that nobody knows anything about,” said Alexandra Leaf, a self-described “chocolate educator” who runs a business called Chocolate Tours of New York City.

The terminology can be a little confusing, but most experts these days use the term “cacao” to refer to the plant or its beans before processing, while the term “chocolate” refers to anything made from the beans, she explained. “Cocoa” generally refers to chocolate in a powdered form, although it can also be a British form of “cacao.”

Etymologists trace the origin of the word “chocolate” to the Aztec word “xocoatl,” which referred to a bitter drink brewed from cacao beans. The Latin name for the cacao tree, Theobroma cacao, means “food of the gods.”

Many modern historians have estimated that chocolate has been around for about 2000 years, but recent research suggests that it may be even older.

In the book The True History of Chocolate, authors Sophie and Michael Coe make a case that the earliest linguistic evidence of chocolate consumption stretches back three or even four millennia, to pre-Columbian cultures of Mesoamerica such as the Olmec.

Last November, anthropologists from the University of Pennsylvania announced the discovery of cacao residue on pottery excavated in Honduras that could date back as far as 1400 B.C.E. It appears that the sweet pulp of the cacao fruit, which surrounds the beans, was fermented into an alcoholic beverage of the time.

“Who would have thought, looking at this, that you can eat it?” said Richard Hetzler, executive chef of the café at the Smithsonian’s National Museum of the American Indian, as he displayed a fresh cacao pod during a recent chocolate-making demonstration. “You would have to be pretty hungry, and pretty creative!”

It’s hard to pin down exactly when chocolate was born, but it’s clear that it was cherished from the start. For several centuries in pre-modern Latin America, cacao beans were considered valuable enough to use as currency. One bean could be traded for a tamale, while 100 beans could purchase a good turkey hen, according to a 16th-century Aztec document.

Both the Mayans and Aztecs believed the cacao bean had magical, or even divine, properties, suitable for use in the most sacred rituals of birth, marriage and death. According to Chloe Doutre-Roussel’s book The Chocolate Connoisseur, Aztec sacrifice victims who felt too melancholy to join in ritual dancing before their death were often given a gourd of chocolate (tinged with the blood of previous victims) to cheer them up.

Sweetened chocolate didn’t appear until Europeans discovered the Americas and sampled the native cuisine. Legend has it that the Aztec king Montezuma welcomed the Spanish explorer Hernando Cortes with a banquet that included drinking chocolate, having tragically mistaken him for a reincarnated deity instead of a conquering invader. Chocolate didn’t suit the foreigners’ taste buds at first – one described it in his writings as “a bitter drink for pigs” – but once mixed with honey or cane sugar, it quickly became popular throughout Spain.

By the 17th century, chocolate was a fashionable drink throughout Europe, believed to have nutritious, medicinal and even aphrodisiac properties (it’s rumored that Casanova was especially fond of the stuff).  But it remained largely a privilege of the rich until the invention of the steam engine made mass production possible in the late 1700s.

In 1828, a Dutch chemist found a way to make powdered chocolate by removing about half the natural fat (cacao butter) from chocolate liquor, pulverizing what remained and treating the mixture with alkaline salts to cut the bitter taste. His product became known as “Dutch cocoa,” and it soon led to the creation of solid chocolate.

The creation of the first modern chocolate bar is credited to Joseph Fry, who in 1847 discovered that he could make a moldable chocolate paste by adding melted cacao butter back into Dutch cocoa.

By 1868, a little company called Cadbury was marketing boxes of chocolate candies in England. Milk chocolate hit the market a few years later, pioneered by another name that may ring a bell – Nestle.

In America, chocolate was so valued during the Revolutionary War that it was included in soldiers’ rations and used in lieu of wages. While most of us probably wouldn’t settle for a chocolate paycheck these days, statistics show that the humble cacao bean is still a powerful economic force. Chocolate manufacturing is a more than 4-billion-dollar industry in the United States, and the average American eats at least half a pound of the stuff per month.

In the 20th century, the word “chocolate” expanded to include a range of affordable treats with more sugar and additives than actual cacao in them, often made from the hardiest but least flavorful of the bean varieties (forastero). 

But more recently, there’s been a “chocolate revolution,” Leaf said, marked by an increasing interest in high-quality, handmade chocolates and sustainable, effective cacao farming and harvesting methods. Major corporations like Hershey’s have expanded their artisanal chocolate lines by purchasing smaller producers known for premium chocolates, such as Scharffen Berger and Dagoba, while independent chocolatiers continue to flourish as well.

“I see more and more American artisans doing incredible things with chocolate,” Leaf said. “Although, I admit that I tend to look at the world through cocoa-tinted glasses.”

The History of The First Telephone


While Italian innovator Antonio Meucci is credited with inventing the first basic phone in 1849, and Frenchman Charles Bourseul devised a phone in 1854, Alexander Graham Bell won the first U.S. patent for the device in 1876.

As with many innovations, the idea for the telephone came along far sooner than it was brought to reality. While Italian innovator Antonio Meucci is credited with inventing the first basic phone in 1849, and Frenchman Charles Bourseul devised a phone in 1854, Alexander Graham Bell won the first U.S. patent for the device in 1876. Bell began his research in 1874 and had financial backers who gave him the best business plan for bringing it to market.

In 1877-78, the first telephone line was constructed, the first switchboard was created and the first telephone exchange was in operation. Three years later, almost 49,000 telephones were in use. In 1880, Bell (in the photo below) merged this company with others to form the American Bell Telephone Company and in 1885 American Telegraph and Telephone Company (AT&T) was formed; it dominated telephone communications for the next century. At one point in time, Bell System employees purposely denigrated the U.S. telephone system to drive down stock prices of all phone companies and thus make it easier for Bell to acquire smaller competitors.

Alexander Graham Bell Using a Telephone

By 1900 there were nearly 600,000 phones in Bell’s telephone system; that number shot up to 2.2 million phones by 1905, and 5.8 million by 1910. In 1915 the transcontinental telephone line began operating. By 1907, AT&T had a near monopoly on phone and telegraph service, thanks to its purchase of Western Union. Its president, Theodore Vail, urged at the time that a monopoly could most efficiently operate the nation’s far-flung communications network. At the urging of the public and AT&T competitors, the government began to investigate the company for anti-trust violations, thus forcing the 1913 Kingsbury Commitment, an agreement between AT&T vice president Nathan Kingsbury and the office of the U.S. Attorney General. Under this commitment, AT&T agreed to divest itself of Western Union and provide long-distance services to independent phone exchanges.

During World War I, the government nationalized telephone and telegraph lines in the United States from June 1918 to July 1919, when, after a joint resolution of Congress, President Wilson issued an order putting them under the direction of the U.S. Post Office. A year later, the systems were returned to private ownership, AT&T resumed its monopolistic hold, and by 1934 the government again acted, this time agreeing to allow it to operate as a “regulated monopoly” under the jurisdiction of the FCC.

Public utility commissions in state and local jurisdictions were appointed regulators of AT&T and the nation’s independent phone companies, while the FCC regulated long-distance services conducted across state lines. They set the rates the phone companies could charge and determined what services and equipment each could offer. This stayed in effect until AT&T’s forced divestiture in 1984, the conclusion of a U.S. Department of Justice anti-trust suit that had been filed in 1974. The all-powerful company had become popularly known and disparaged as “Ma Bell.” AT&T’s local operations were divided into seven independent Regional Bell Operating Companies, known as the “Baby Bells.” AT&T became a long-distance-services company.

By 1948, the 30 millionth phone was connected in the United States; by the 1960s, there were more than 80 million phone hookups in the U.S. and 160 million in the world; by 1980, there were more than 175 million telephone subscriber lines in the U.S. In 1993, the first digital cellular network went online in Orlando, Florida; by 1995 there were 25 million cellular phone subscribers, and that number exploded at the turn of the century, with digital cellular phone service expected to replace land-line phones for most U.S. customers by as early as 2010.

World Changes Due to the Telephone

Within 50 years of its invention, the telephone had become an indispensable tool in the United States. In the late 19th century, people raved about the telephone’s positive aspects and ranted about what they anticipated would be negatives. Their key points, recorded by Ithiel de Sola Pool in his 1983 book “Forecasting the Telephone,” mirror nearly precisely what was later predicted about the impact of the internet.

For example, people said the telephone would:

  • help further democracy
  • be a tool for grassroots organizers
  • lead to additional advances in networked communications
  • allow social decentralization, resulting in a movement out of cities and more flexible work arrangements
  • change marketing and politics
  • alter the ways in which wars are fought
  • cause the postal service to lose business
  • open up new job opportunities
  • allow more public feedback
  • make the world smaller, increasing contact between peoples of all nations and thus fostering world peace
  • increase crime and aid criminals
  • be an aid for physicians, police, fire, and emergency workers
  • be a valuable tool for journalists
  • bring people closer together, decreasing loneliness and building new communities
  • inspire a decline in the art of writing
  • have an impact on language patterns and introduce new words
  • someday lead to an advanced form of the transmission of intelligence.

Privacy was also a major concern. As is the case with the Internet, the telephone worked to improve privacy while simultaneously leaving people open to invasions of their privacy. In the beginning days of the telephone, people would often have to journey to the local general store or some other central point to be able to make and receive calls. Most homes weren’t wired together, and eavesdroppers could hear you conduct your personal business as you used a public phone. Switchboard operators who connected the calls would also regularly invade people’s privacy. The early house-to-house phone systems were often “party lines” on which a number of families would receive calls, and others were free to listen in and often chose to do so.

Today, while most homes are wired and people can travel freely, conducting their phone conversations wirelessly, wiretapping and other surveillance methods can be utilized to listen in on their private business. People’s privacy can also be interrupted by unwanted phone calls from telemarketers and others who wish to profit in some way – just as Internet e-mail accounts receive unwanted sales pitches, known as “spam.”

Yet, the invention of the telephone also worked to increase privacy in many ways. It permitted people to exchange information without having to put it in writing, and a call on the phone came to replace such intrusions on domestic seclusion as unexpected visits from relatives or neighbors and the pushy patter of door-to-door salesmen. The same could be said for the Internet – privacy has been enhanced in some ways because e-mail and instant messaging have reduced the frequency of the jangling interruptions previously dished out by our telephones.

Past Predictions About the Future of the Telephone

President Rutherford B. Hayes to Alexander Graham Bell in 1876 on viewing the telephone for the first time:

“That’s an amazing invention, but who would ever want to use one of them?”

Bell offered to sell his telephone patent to Western Union for $100,000 in 1876, when he was struggling with the business. An account that is believed by some to be apocryphal, but is still recounted in many telephone histories, states that the committee appointed to investigate the offer filed the following report:

Telephone Advertisement

“We do not see that this device will be ever capable of sending recognizable speech over a distance of several miles. Messer Hubbard and Bell want to install one of their ‘telephone devices’ in every city. The idea is idiotic on the face of it. Furthermore, why would any person want to use this ungainly and impractical device when he can send a messenger to the telegraph office and have a clear written message sent to any large city in the United States? … Mr. G.G. Hubbard’s fanciful predictions, while they sound rosy, are based on wild-eyed imagination and lack of understanding of the technical and economic facts of the situation, and a posture of ignoring the obvious limitations of his device, which is hardly more than a toy … This device is inherently of no use to us. We do not recommend its purchase.”

As reported in the book “Bell” by Robert V. Bruce, Kate Field, a British reporter who knew Bell, predicted in 1878 that eventually:

“While two persons, hundreds of miles apart, are talking together, they will actually see each other.”

Sir William Preece, chief engineer for the British Post Office, 1878, as reported in “The Telephone in a Changing World” by Marion May Dilts:

“There are conditions in America which necessitate the use of such instruments more than here. Here we have a superabundance of messengers, errand boys and things of that kind … The absence of servants has compelled America to adopt communications systems for domestic purposes.”

AT&T chief engineer and Electrical Review writer John J. Carty projected in his “Prophets Column” in 1891:

“A system of telephony without wires seems one of the interesting possibilities, and the distance on the earth through which it is possible to speak is theoretically limited only by the curvation of the earth.”

Carty also wrote:

“Someday we will build up a world telephone system, making necessary to all peoples the use of a common language or common understanding of languages, which will join all the people of the earth into one brotherhood. There will be heard throughout the earth a great voice coming out of the ether which will proclaim, ‘Peace on earth, good will towards men.’”

In the 1912 article “The Future Home Theatre” in The Independent, S.C. Gilfillan wrote:

“There are two mechanical contrivances … each of which bears in itself the power to revolutionize entertainment, doing for it what the printing press did for books. They are the talking motion picture and the electric vision apparatus with telephone. Either one will enable millions of people to see and hear the same performance simultaneously .. or successively from kinetoscope and phonographic records … These inventions will become cheap enough to be … in every home … You will have the home theatre of 1930, oh ye of little faith.”


The History of The First Concrete


The first concrete-like structures were built by the Nabataea traders, or Bedouins, who occupied and controlled a series of oases and developed a small empire in the regions of southern Syria and northern Jordan in around 6500 BC.

There is no denying that concrete and the technology surrounding it has come a long way since its discovery and development. From the Great Pyramids at Giza to smart sensors for testing concrete temperature, maturity, etc., we’ve put together a list of notable events and discoveries in the history of concrete.

6500 BC – Syria and Jordan: The earliest records of concrete-like structures date back to 6500 BC, to the Nabataea traders in regions of southern Syria and northern Jordan. They created concrete floors, housing structures, and underground cisterns.

3000 BC – Egypt and China: Egyptians used mud mixed with straw to bind dried bricks. They also used gypsum mortars and mortars of lime in the pyramids. The Great Pyramids at Giza used about 500,000 tons of mortar. A form of cement was also used to build the Great Wall of China around this time.

600 BC – Rome: Although the ancient Romans weren’t the first to create concrete, they were the first to use the material widely. By 200 BC, the Romans successfully implemented the use of concrete in the majority of their construction. They used a mixture of volcanic ash, lime, and seawater to form the mix. They then packed the mix into wooden forms and, once it hardened, stacked the blocks like brick. More than 2,000 years later, many Roman concrete structures still stand, thanks to the way their ingredients reacted with seawater and the surrounding environment to strengthen over time.

Technological Milestones: During the Middle Ages, concrete technology crept backward. After the fall of the Roman Empire in 476 AD, the technique for making pozzolan cement was lost until manuscripts describing it were rediscovered in 1414. This rekindled interest in building with concrete.

It wasn’t until 1793 that the technology took a big leap forward when John Smeaton discovered a more modern method for producing hydraulic lime for cement. He used limestone containing clay that was fired until it turned into clinker, which was then ground into powder. He used this material in the historic rebuilding of the Eddystone Lighthouse in Cornwall, England.

In 1824 Joseph Aspdin invented Portland cement by burning finely ground chalk and clay until the carbon dioxide was removed. Aspdin named the cement after the high-quality building stones quarried in Portland, England.

In the 19th Century concrete was used mainly for industrial buildings. The first widespread use of Portland cement in home construction was in England and France between 1850 and 1880 by Francois Coignet, who added steel rods to prevent exterior walls from spreading.

The History of The First Cement


Cement has been in use by humans throughout history; variations of the material were used up to 12,000 years ago, with the earliest archaeological discovery being a consolidated, whitewashed floor made from burned limestone and clay found in modern-day Turkey. The first fired clay bricks were developed in the so-called Fertile Crescent, where it was discovered that lime could be produced from burnt limestone to prepare mortar. Around 800 BC, the Phoenicians used the knowledge that a mixture of burnt lime and volcanic ash, today called ‘pozzolana’, could be used to produce hydraulic lime, which was not only stronger than anything previously used but also hardened under water.

The Romans developed new masonry techniques with which they could erect grand buildings with heavy foundations. One such development was “opus caementitium”, a type of concrete made of lime with aggregates of sand and crushed rock. This was mostly poured between masonry stones or bricks, which served as formwork. Other cements used crushed brick, tiles and ceramic pottery as aggregates. The Roman architect and engineer Marcus Vitruvius Pollio comprehensively described the knowledge and construction techniques of the time, which went on to serve as the basis of building methods for hundreds of years.

Famous historical buildings made from concrete, still standing today, are the Colosseum and Pantheon in Rome, and the Hagia Sophia in Istanbul.

The Middle Ages

The Middle Ages were a quiet time in the history of cement; any discoveries made during this era remain unknown, although masons are known to have used hydraulic cements to build structures such as fortresses and canals.

In the guilds of the Middle Ages, knowledge was kept secret and passed on to students orally rather than in writing, whilst alchemists researched the properties and reactivity of substances, often using coded language. Typical mortars from this time consisted of lime and sand – concrete as we know it did not yet exist.

The Industrial Revolution in Europe in the late 18th century saw a flurry of new developments in cement and concrete, with important contributions made by John Smeaton, who discovered that the hydraulicity of lime was directly related to the limestone’s clay content, James Parker, Louis Vicat and Egor Cheliev.

The Birth of Portland Cement

The precursor to modern-day cement was created in 1824 by Joseph Aspdin, a British bricklayer and builder, who experimented with heating limestone and clay until the mixture calcined, grinding it and then mixing it with water. Aspdin named this Portland Cement, after the famously strong building stone from the Isle of Portland in Dorset, UK. His son, William Aspdin, made the first cement containing alite (an impure form of tricalcium silicate).

In 1845, Isaac Johnson fired chalk and clay at much higher temperatures than the Aspdins, at around 1,400-1,500 °C, which led to the mixture clinkering and produced what is essentially modern-day cement.

From 1850, the use of concrete made from Portland cement increased considerably. Projects such as sculptures, small bridges and concrete pipes were typical applications at the time and helped to increase its prominence. Then followed large scale sewage systems, such as in London and Paris, and the construction of metros and subways boosted demand. By the end of the 19th century, hollow concrete blocks for housing construction became mainstream. 

The advent of reinforced concrete began in the 1840s in France, starting a period of innovation in which reinforced columns, girders and other elements allowed the construction of larger bridges and taller buildings, and challenged the dominance of steel construction.

The first cement standard for Portland cement was approved in Germany in 1878, defining the first test methods and minimum properties, with many other countries following suit. 

Cement production and applications surged globally at the turn of the century. Since the 1900s, rotary kilns have replaced the original vertical shaft kilns; they use radiative heat transfer, which is more efficient at higher temperatures, achieve a more uniform clinkering temperature and produce stronger cement. Gypsum is now also added to the resulting mixture to control setting, and ball mills are used to grind the clinker.

Other developments in the last century include calcium aluminate cements for better sulphate resistance, the blending of Rosendale (a natural hydraulic cement produced in New York) and Portland cements to make a durable and fast-setting cement in the USA, and the increased usage of cementitious materials to store nuclear waste.

The Future of Cement and Concrete

New technologies and innovations are constantly emerging to improve the sustainability, strength and applications of cement and concrete. Some advanced products incorporate fibres and special aggregates to create roof tiles and countertops, for example, whilst offsite manufacture is also gaining prominence with the rise of digitalisation and AI, which could reduce waste and improve efficiency and on-site working conditions. Cements and concretes are also being developed which can absorb CO2 over their lifetimes, reducing the carbon footprint of the building material.

Cement as we know it was first developed by Joseph Aspdin, an enterprising 19th-century British stonemason, who heated a mix of ground limestone and clay in his kitchen stove, then pulverized the concoction into a fine powder.

The result was the world’s first hydraulic cement: one that hardens when water is added. Aspdin dubbed his creation Portland cement due to its similarity to a stone quarried on the Isle of Portland, off the British coast. In 1824, this brilliant craftsman obtained a patent for what would prove to be the world’s most ubiquitous building material, laying the foundation for today’s global Portland cement industry.


Manufacturing Process

Portland cement – a combination of calcium, silica, aluminum and iron – is the fundamental ingredient in concrete.

Producing a calcium-silicate Portland cement that conforms to specific chemical and physical specifications demands careful control of the manufacturing process.

First, the raw materials – limestone, shells or chalk along with shale, clay, sand or iron ore – are mined from a quarry that’s usually near the manufacturing plant. Before leaving the quarry these materials are reduced in size by two sets of crushers. The primary set crushes the stone to about five inches (125 mm) in diameter and the secondary set pulverizes it to just 3/4 inch (19 mm). Then the raw materials are sent to the manufacturing plant, where they are proportioned to create cements with specific chemical compositions.

Portland cement is manufactured using two methods: wet and dry.

In the dry method, dry raw materials are proportioned before being ground into a fine powder, blended, then fed dry into a kiln.

In the wet method, a slurry is created by adding water to properly proportioned raw materials prior to them being ground, blended and fed into the upper end of a tilted and rotating cylindrical kiln, where their rate of passage is controlled by the kiln’s slope and rotational speed.

Burning fuel – usually powdered coal or natural gas – is then forced into the kiln’s lower end, heating the raw materials to 2,600-3,000 degrees F (1,430-1,650 degrees C). At 2,700 degrees F (1,480 degrees C), several chemical reactions fuse the raw materials, creating what are called cement clinkers: grayish-black pellets the size of marbles.
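
As a quick check on those temperature ranges, here is a minimal conversion sketch in Python (plain arithmetic only; nothing here is specific to any cement-plant software):

    # Convert the kiln temperatures quoted above from Fahrenheit to Celsius.
    def fahrenheit_to_celsius(temp_f):
        return (temp_f - 32) * 5 / 9

    for temp_f in (2600, 2700, 3000):
        print(f"{temp_f} F -> {fahrenheit_to_celsius(temp_f):.0f} C")
    # 2600 F -> 1427 C, 2700 F -> 1482 C, 3000 F -> 1649 C, matching the
    # rounded 1,430 / 1,480 / 1,650 degrees C figures given in the text.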

The red-hot clinkers are discharged from the lower end of the kiln and transferred into various types of coolers to reduce their temperature so they can be handled safely. Now cooled, the clinkers are combined with gypsum and ground into a gray powder so fine that it can pass through a 75-micron – or number 200 mesh – sieve.

This fine gray powder is Portland cement.


Types of Portland Cement

The flexibility of Portland cement is evident in the different types, which are manufactured to meet various physical and chemical requirements.

The American Society for Testing and Materials (ASTM) Specification C-150 provides for the following ten individual types of Portland cement.

  • Type I – For use when the special properties specified for any other type are not required.
  • Type IA – Air-entraining cement for the same uses as Type I, where air-entrainment is desired.
  • Type II – For general use, more especially when moderate sulfate resistance is desired.
  • Type IIA – Air-entraining cement for the same uses as Type II, where air-entrainment is desired.
  • Type II(MH) – For general use, more especially when moderate heat of hydration and moderate sulfate resistance are desired.
  • Type II(MH)A – Air-entraining cement for the same uses as Type II(MH), where air-entrainment is desired.
  • Type III – For use when high early strength is desired.
  • Type IIIA – Air-entraining cement for the same use as Type III, where air-entrainment is desired.
  • Type IV – For use when a low heat of hydration is desired.
  • Type V – For use when high sulfate resistance is desired.

White Portland Cement

When architectural considerations require white or colored concrete or mortar, Portland cement can adapt with the manufacture of white Portland cement, just one of a number of special-purpose hydraulic cement types available.

White Portland cement is identical in composition to the traditional gray-colored product, except in color. This is made possible during the manufacturing process by selecting raw materials containing only negligible amounts of the iron and magnesium oxides that give Portland cement its gray color.


Blended Hydraulic Cements

Blended hydraulic cements, designed to conform to the special requirements of the ASTM C595 or C1157 standards, are produced by mixing Portland cement with ground granulated blast-furnace slag, fly ash, natural pozzolans and silica fume. These cements may also be designated as air-entraining, moderate sulfate-resistant, or as having moderate or low heat of hydration, depending on the need.


ASTM C1157-compliant cements include:

  • Type GU – blended hydraulic cement for general construction.
  • Type HE – high-early-strength cement.
  • Type MS – moderate sulfate-resistant cement.
  • Type HS – high sulfate-resistant cement.
  • Type MH – moderate heat of hydration cement.
  • Type LH – low heat of hydration cement.

The ASTM C1157-compliant cements can also be designated for low reactivity (option R) with alkali-reactive aggregates. There are no restrictions on the composition of C1157 cements. Manufacturers can optimize ingredients, such as pozzolans and slags, to achieve a particular set of concrete properties.

Of all the blended cements available throughout the world, Types IP and IS are the most common. While Europe and Asia currently use more blended cements than the United States, environmental and energy concerns, in addition to consumer demand for cements with specific properties, may alter this situation.


The History of The First Fan


The first electric fan was created by Dr. Schuyler Skaats Wheeler in 1886. It was a small, two-blade personal desk fan that was DC powered. Made of brass and loved by all who worked inside in the summer, the fan was modern, effective, and dangerous, as there was no cage surrounding the blades. Before the late 1800s, being too hot was an everyday problem. Unlike being too cold (where you can build a fire or add layers of clothing), there were minimal options for overcoming the heat. Once electricity became available, it didn’t take long for inventors to start working on ideas for the electric fan. AC motors, produced in the 1890s, became more common and replaced DC versions (Challoner, 428).

Fans for the Common Folk

Like most inventions, the newest technology was expensive at first, which meant that only the most affluent could afford it. Changes began in the 1920s, when mass production of steel fan blades reduced prices. The designs also began to change when GE developed an overlapping blade design that was much quieter than previous models. Variations in color and style appealed to consumers, and electric fans became common appliances in most households.

Modern Electric Fans

Today’s electric fans are still a very popular solution to overcome the heat. Portable fans, whether it be the personal desk fan or a small battery powered fan that you can carry on hot summer days, make it convenient to keep cool.

Ceiling fans are also an effective method for circulating air in the summer, but they can also be used to circulate warm air in the winter by changing the direction of the blades. Fans without visible blades are the newest innovation in air circulation.

Electric fans are a prevalent part of our society, just as Keystone Electronics Corporation is a prevalent part of the fan’s ability to function properly. Our electronic components keep fan blades spinning to help keep you cool. Keystone’s line of components includes quick fit terminals, battery & coin cell holders, and contacts & clips to keep fans functioning at optimum performance.

The History of The First Watch


At the beginning of the 16th century, the German Peter Henlein from Nuremberg made the first portable clock with a spring mechanism. This is seen as the first watch. People also called these watches Nuremberg eggs, because of their oval enclosures.

Who invented the watch, and why? For the emergence of the watch we have to go several hundred years back in time, to the 15th century, where the history of the watch began.

Explorers needed to know the time to navigate at sea. Latitude could already be accurately determined from the stars, but determining longitude required an accurate clock. If the clock was one minute off, the position error near the equator was already about 28 km. So the origin of the watch stemmed from the need for a precise navigation tool.
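
To see where that figure of roughly 28 km per minute comes from, here is a back-of-the-envelope sketch in Python (it assumes an equatorial circumference of about 40,075 km, a standard reference value rather than a number from this article):

    # The Earth turns 360 degrees in 24 hours, so near the equator each minute
    # of clock error corresponds to a fixed east-west distance.
    EQUATORIAL_CIRCUMFERENCE_KM = 40_075
    MINUTES_PER_DAY = 24 * 60

    km_per_minute_of_error = EQUATORIAL_CIRCUMFERENCE_KM / MINUTES_PER_DAY
    print(f"about {km_per_minute_of_error:.1f} km of error per minute of clock drift")
    # about 27.8 km - in line with the 28 km figure quoted above.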

The history of the watch started with the so-called quadrans, a disc with which the time could be determined on the basis of angle measurements. The first mechanical clocks that were accurate enough worked with a pendulum. Such a clock needed to hang still, so it was not suitable for use at sea or in a pocket.


The origin of the watch

Then it happened, at the beginning of the 16th century: the German Peter Henlein from Nuremberg made the first portable clock with a spring mechanism. This is seen as the first watch. People also called these watches Nuremberg eggs, because of their oval enclosures. It was still an art to make a mechanical watch run accurately. The Swiss mechanics Jacob Zech and Gruet changed this with their innovation: they built a mechanism that provided greater resistance when the spring was tightly wound.

Next step in the watch history

Around 1550 the gears began to be made of copper. This was a nice improvement, because the earliest parts were often faulty. Copper can be worked more finely, making the clock more accurate. As you can imagine, this was a very expensive process involving a great deal of fine handwork. A watch was therefore already a status symbol back then.


History of watches: The wristwatch

The next step in the history of the watch was made in 1675 by the Englishman Robert Hooke and the Dutchman Christiaan Huygens. Independently of each other, they created a mechanism in which a spring regulated a balance wheel. Thanks to shared knowledge and a great deal of fine handwork, the watch was sufficiently developed by 1762 to be accurate enough for navigation.

The wristwatch was invented at the end of the 19th century and was mainly seen as a piece of women’s jewellery. The First World War had a profound influence on the history of the watch. Soldiers started to wear watches on the wrist so they could read the time more quickly in the heat of battle. With this also came the demand for quality watches.

The first quartz watch was only created in the second half of the 20th century. This watch, powered by a battery, almost meant the demise of the traditional watchmakers, because a quartz watch is a lot cheaper than a mechanical watch.

The History of The First Hygrometer


A hygrometer is an instrument used to measure the moisture content – that is, the humidity – of air or any other gas. The hygrometer is a device that has had many incarnations. Leonardo da Vinci built the first crude hygrometer in the 1400s. Francesco Folli invented a more practical hygrometer in 1664. In 1783, the Swiss physicist and geologist Horace Bénédict de Saussure built the first hygrometer using a human hair to measure humidity. These are called mechanical hygrometers, based on the principle that organic substances (such as human hair) contract and expand in response to the relative humidity. The contraction and expansion move a needle gauge.


Dry and Wet-Bulb Psychrometer

The best-known type of hygrometer is the “dry and wet-bulb psychrometer”, best described as two mercury thermometers, one with a wetted base, one with a dry base. The water from the wet base evaporates and absorbs heat, causing that thermometer’s reading to drop. Using a calculation table, the reading from the dry thermometer and the reading drop from the wet thermometer are used to determine the relative humidity. While the term “psychrometer” was coined by the German Ernst Ferdinand August, the 19th-century physicist Sir John Leslie (1776-1832) is often credited with actually inventing the device.
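
To illustrate how the two readings map to a relative-humidity figure, here is a minimal sketch in Python. It uses the common Magnus approximation for saturation vapour pressure and a textbook psychrometer coefficient; these constants are standard approximations, not values taken from this article, so treat the output as an estimate rather than a calibration.

    import math

    def saturation_vapor_pressure(temp_c):
        """Saturation vapour pressure in hPa (Magnus approximation)."""
        return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

    def relative_humidity(dry_bulb_c, wet_bulb_c, pressure_hpa=1013.25):
        """Estimate relative humidity (%) from dry- and wet-bulb temperatures."""
        # Psychrometer equation: the actual vapour pressure is the saturation
        # pressure at the wet-bulb temperature minus a term proportional to the
        # temperature difference between the two bulbs.
        a = 0.00066 * (1 + 0.00115 * wet_bulb_c)
        e_actual = saturation_vapor_pressure(wet_bulb_c) - a * pressure_hpa * (dry_bulb_c - wet_bulb_c)
        return 100 * e_actual / saturation_vapor_pressure(dry_bulb_c)

    print(round(relative_humidity(25.0, 20.0)))  # roughly 63, close to printed psychrometric tables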

Some hygrometers use the measurements of changes in electrical resistance, using a thin piece of lithium chloride or other semiconductive material and measuring the resistance, which is affected by humidity.

Other Hygrometer Inventors

Robert Hooke: A 17th-century contemporary of Sir Isaac Newton, Hooke invented or improved a number of meteorological instruments, such as the barometer and the anemometer. His hygrometer, regarded as the first mechanical hygrometer, used the husk of an oat grain, which he noted curled and uncurled depending on the humidity of the air. Hooke’s other inventions include the universal joint, an early prototype of the respirator, the anchor escapement and the balance spring, which made more accurate clocks possible. Most famously, however, he was the first to discover cells.

John Frederic Daniell: In 1820, the British chemist and meteorologist John Frederic Daniell invented a dew-point hygrometer, which came into widespread use to measure the temperature at which moist air reaches its saturation point. Daniell is best known for inventing the Daniell cell, an improvement over the voltaic cell used in the early history of battery development.

The History of The First Clock

The first model pendulum clock was built in 1657 in The Hague, but it was in England that the idea was taken up. The longcase clock (also known as the grandfather clock) was created to house the pendulum and works by the English clockmaker William Clement in 1670 or 1671.

The history of timekeeping devices dates back to when ancient civilizations first observed astronomical bodies as they moved across the sky. Devices and methods for keeping time have since then improved through a long series of new inventions and ideas. Sundials and water clocks originated from ancient Egypt, and were later used by the Babylonians, the Greeks and the Chinese; medieval Islamic water clocks were unrivalled in their sophistication until the mid-14th century. Incense clocks, which may have been invented in India, were being used in China by the 6th century. The hourglass, one of the few reliable methods of measuring time at sea, was a European invention and does not seem to have been used in China before the mid-16th century.

In medieval Europe, purely mechanical clocks were developed after the invention of the bell-striking alarm, used to warn a man to toll the monastic bell. The weight-driven mechanical clock, controlled by the action of a verge and foliot, was a synthesis of earlier ideas derived from European and Islamic science, and one of the most important inventions in the history of timekeeping. The most famous mechanical clock was designed and built by Henry de Vick in c.1360—for the next 300 years, all the improvements in timekeeping were essentially developments based on it. The invention of the mainspring in the early 15th century allowed small clocks to be built for the first time.

From the 17th century, the discovery that clocks could be controlled by harmonic oscillators led to the most productive era in the history of timekeeping. Leonardo da Vinci had produced the earliest known drawings of a pendulum in 1493–1494, and in 1582 Galileo Galilei had investigated the regular swing of the pendulum, discovering that frequency was only dependent on length. The pendulum clock, designed and built by Dutch polymath Christiaan Huygens in 1656, was so much more accurate than other kinds of mechanical timekeepers that few clocks have survived with their verge and foliot mechanisms intact. Other innovations in timekeeping during this period include inventions for striking clocks, the repeating clock and the deadbeat escapement. Errors in early pendulum clocks were eclipsed by those caused by temperature variation, a problem tackled during the 18th century by the English clockmakers John Harrison and George Graham; only the invention of invar in 1895 eliminated the need for such innovations.
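
Galileo’s observation corresponds to the small-swing formula for a simple pendulum, in which the frequency depends only on the pendulum’s length and local gravity, not on the weight of the bob or the width of the swing. A minimal sketch (idealized physics, ignoring air resistance and the escapement):

    import math

    G = 9.81  # gravitational acceleration in m/s^2

    def pendulum_frequency(length_m):
        """Small-swing frequency of an ideal simple pendulum, in hertz."""
        return math.sqrt(G / length_m) / (2 * math.pi)

    # A pendulum roughly 0.994 m long has a two-second period, so each swing
    # takes one second - the "seconds pendulum" prized by early clockmakers.
    print(f"{pendulum_frequency(0.994):.3f} Hz")  # about 0.500 Hz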

From the 18th century, a succession of innovations and inventions led to timekeeping devices becoming increasingly accurate. Following the Scilly naval disaster of 1707, after which governments offered a prize to anyone who could discover a way to determine longitude, Harrison built a succession of accurate timepieces. The electric clock, invented in 1840, was used to control the most accurate pendulum clocks until the 1940s, when quartz timers became the basis for the precise measurement of time and frequency. The wristwatch, which had been recognised as a valuable military tool during the Boer War, became a symbol of masculinity and bravado after World War I. During the 20th century the non-magnetic wristwatch, battery-driven watches, the quartz wristwatch, and transistors and plastic parts were all invented. The most accurate timekeeping devices in practical use today are atomic clocks, which can be accurate to within a few billionths of a second per year. They are used to calibrate other clocks and timekeeping instruments.

A clock or a timepiece is a device used to measure and indicate time. The clock is one of the oldest human inventions, meeting the need to measure intervals of time shorter than the natural units: the day, the lunar month and the year. Devices operating on several physical processes have been used over the millennia.

Some predecessors to the modern clock may be considered as “clocks” that are based on movement in nature: A sundial shows the time by displaying the position of a shadow on a flat surface. There is a range of duration timers, a well-known example being the hourglass. Water clocks, along with the sundials, are possibly the oldest time-measuring instruments. A major advance occurred with the invention of the verge escapement, which made possible the first mechanical clocks around 1300 in Europe, which kept time with oscillating timekeepers like balance wheels.

Traditionally, in horology, the term clock was used for a striking clock, while a clock that did not strike the hours audibly was called a timepiece. This distinction is no longer made. Watches and other timepieces that can be carried on one’s person are usually not referred to as clocks. Spring-driven clocks appeared during the 15th century. During the 15th and 16th centuries, clockmaking flourished. The next development in accuracy occurred after 1656 with the invention of the pendulum clock by Christiaan Huygens. A major stimulus to improving the accuracy and reliability of clocks was the importance of precise time-keeping for navigation. The mechanism of a timepiece with a series of gears driven by a spring or weights is referred to as clockwork; the term is used by extension for a similar mechanism not used in a timepiece. The electric clock was patented in 1840, and electronic clocks were introduced in the 20th century, becoming widespread with the development of small battery-powered semiconductor devices.

The timekeeping element in every modern clock is a harmonic oscillator, a physical object (resonator) that vibrates or oscillates at a particular frequency. This object can be a pendulum, a tuning fork, a quartz crystal, or the vibration of electrons in atoms as they emit microwaves.

Clocks have different ways of displaying the time. Analog clocks indicate time with a traditional clock face, with moving hands. Digital clocks display a numeric representation of time. Two numbering systems are in use: 24-hour time notation and 12-hour notation. Most digital clocks use electronic mechanisms and LCD, LED, or VFD displays. For the blind and for use over telephones, speaking clocks state the time audibly in words. There are also clocks for the blind that have displays that can be read by touch. The study of timekeeping is known as horology.
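
As a small illustration of the two numbering systems mentioned above, here is a minimal sketch in Python converting a 24-hour reading into 12-hour notation (a generic helper, not tied to any particular clock design):

    def to_12_hour(hour_24, minute):
        """Convert a 24-hour clock reading to 12-hour notation with AM/PM."""
        suffix = "AM" if hour_24 < 12 else "PM"
        hour_12 = hour_24 % 12
        if hour_12 == 0:  # 00:xx becomes 12:xx AM, 12:xx stays 12:xx PM
            hour_12 = 12
        return f"{hour_12}:{minute:02d} {suffix}"

    print(to_12_hour(0, 5))    # 12:05 AM
    print(to_12_hour(13, 30))  # 1:30 PM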

The History of The First Radio


In modern society, radios are common technology in the car and in the home. In fact, in today’s world one would be hard pressed to find anyone who has not heard of, seen, or used a radio during his or her life, regardless of how old or young they may be. This was not always the case, however. Before the 19th century, wireless radio communication in everyday life was a thing of fantasy. Even after the development of the radio in the late 1800s, it took many years before radios went mainstream and became a household fixture. The history of the radio is a fascinating one that changed how the world connected and communicated from distances both far and near.

While the radio enjoys a long and interesting history, its earliest beginnings are still quite controversial. There’s some debate as to who actually invented the radio. While we may not know with certainty who put together the first radio device, we do know that in 1893 the inventor Nikola Tesla demonstrated a wireless radio in St. Louis, Missouri. Despite this demonstration, Guglielmo Marconi is the person most often credited as the father and inventor of the radio. It was Marconi who was awarded the very first wireless telegraphy patent in England in the year 1896, securing his spot in radio’s history. A year later, however, Tesla filed for patents for his basic radio in the United States. His patent request was granted in 1900, four full years after Marconi’s patent was awarded. Regardless of who created the very first radio, on December 12, 1901, Marconi’s place in history was forever sealed when he became the first person to transmit signals across the Atlantic Ocean.

Before and During World War I

Prior to the 1920s, the radio was primarily used to contact ships that were out at sea. Radio communications were not very clear, so operators typically relied on the use of Morse code messages. This was of great benefit to vessels in the water, particularly during emergency situations. With World War I, the importance of the radio became apparent and its usefulness increased significantly. During the war, the military used it almost exclusively and it became an invaluable tool in sending and receiving messages to the armed forces in real time, without the need for a physical messenger.

Radio and the 1920s

In the 1920s, following the war, civilians began to purchase radios for private use. Across the U.S. and Europe, broadcasting stations such as KDKA in Pittsburgh, Pennsylvania and England’s British Broadcasting Company (BBC) began to surface. In 1920, the Westinghouse Company applied for and received a commercial radio license which allowed for the creation of KDKA. KDKA would then become the first radio station officially licensed by the government. It was also Westinghouse which first began advertising the sale of radios to the public. While manufactured radios were finding their way into the mainstream, home-built radio receivers were a solution for some households. This began to create a problem for the manufacturers who were selling pre-made units. As a result, the Radio Corporation of America (RCA) was sanctioned by the government. Under the RCA agreements, certain companies could make receivers, while other companies were approved to make transmitters. Only one company, AT&T, was able to toll and chain broadcast. It was AT&T that, in 1923, released the first radio advertisement. In the late 1920s, CBS and NBC were created in response to AT&T being the sole station with rights to toll broadcasting.

In Britain, radio broadcasts began in 1922 with the British Broadcasting Company, or BBC, in London. The broadcasts quickly spread across the UK but failed to usurp newspapers until 1926 when the newspapers went on strike. At this point the radio and the BBC became the leading source of information for the public. In both the U.S. and the U.K. it also became a source of entertainment in which gathering in front of the radio as a family became a common occurrence in many households.

World War II and Changes Following the War

During World War II, the radio once again fulfilled an important role for both the U.S. and the U.K. With the help of journalists, radio relayed news of the war to the public. It was also a rallying source and was used by the government to gain public support for the war. In the U.K. it became the primary source of information after the shut-down of television stations. The way in which radio was used also changed the world after World War II. While radio had previously been a source of entertainment in the form of serial programs, after the war it began to focus more on playing the music of the time. The “Top-40” in music became popular during this period and the target audience went from families to pre-teens up to adults in their mid-thirties. Music and radio continued to rise in popularity until they became synonymous with one another. FM radio stations began to overtake the original AM stations, and new forms of music, such as rock and roll, began to emerge.

The Present and Future of Radio

Today, radio has become much more than Tesla or Marconi could have ever imagined. Traditional radios and radio broadcasting have become a thing of the past. Instead, radio has steadily evolved to keep up with current technology, with satellite and streaming internet stations gaining popularity. Radios are found not only in homes, but they are also a staple in vehicles. In addition to music, radio talk shows have also become a popular option for many. On the two-way radios front, newer digital two-way radios allow for one-to-one communication that is typically encrypted for improved security. Short-range radios have improved communications at worksites and handheld radios have become essential in sports, television production and even commercial airline operations.

History of the First Television


Televisions can be found in billions of homes around the world. But 100 years ago, nobody even knew what a television was. In fact, as late as 1947, only a few thousand Americans owned televisions. How did such a groundbreaking technology turn from a niche invention into a living room mainstay? Below, we explain the complete history of the television – including where it could be going in the future.

Mechanical Televisions in the 1800s and Early 1900s

Prior to electric televisions, we had mechanical televisions.

These early televisions started appearing in the late 1800s. They involved mechanically scanning images and then transmitting those images onto a screen. Compared to electronic televisions, they were extremely rudimentary.

One of the first mechanical televisions used a rotating disk with holes arranged in a spiral pattern. This device was created independently by two inventors: Scottish inventor John Logie Baird and American inventor Charles Francis Jenkins. Both devices were invented in the early 1920s.

Prior to these two inventors, German inventor Paul Gottlieb Nipkow had developed the first mechanical television. That device sent images through wires using a rotating metal disk. Instead of calling the device a television, however, Nipkow called it an “electric telescope”. The device had 18 lines of resolution.

In 1907, two inventors – Russian Boris Rosing and English A.A. Campbell-Swinton – combined a cathode ray tube with a mechanical scanning system to create a totally new television system.

Ultimately, the early efforts of these inventors would lead to the world’s first electrical television a few years later.

The First Electronic Television was Invented in 1927

The world’s first electronic television was created by a 21-year-old inventor named Philo Taylor Farnsworth. That inventor lived in a house without electricity until he was 14. Starting in high school, he began to think of a system that could capture moving images, transform those images into code, then move those images along radio waves to different devices.

Farnsworth was miles ahead of any mechanical television system invented to-date. Farnsworth’s system captured moving images using a beam of electrons (basically, a primitive camera).

The first image ever transmitted by television was a simple line. Later, Farnsworth would famously transmit a dollar sign using his television after a prospective investor asked “When are we going to see some dollars in this thing, Farnsworth?”

Between 1926 and 1931, mechanical television inventors continued to tweak and test their creations. However, they were all doomed to become obsolete in comparison to modern electrical televisions: by 1934, all TVs had been converted to the electronic system.

Understandably, all early television systems transmitted footage in black and white. Color TV, however, was first theorized way back in 1904 – something we’ll talk about later on.

How Did Early Televisions Work?

The two types of televisions listed above, mechanical and electronic, worked in vastly different ways. We’ve hinted at how these TVs worked above, but we’ll go into a more detailed description in this section.

Mechanical Televisions

Mechanical televisions relied on rotating disks to transmit images from a transmitter to the receiver. Both the transmitter and receiver had rotating disks. The disks had holes spaced around them, with each hole slightly lower than the one before.

To transmit images, you had to place a camera in a totally dark room, then place a very bright light behind the disk. That disk would be turned by a motor in order to make one revolution for every frame of the TV picture.

The disk in Baird’s early mechanical television had 30 holes and rotated 12.5 times per second. A lens in front of the disk focused light onto the subject.

When light hit the subject, it was reflected into a photoelectric cell, which converted the light energy into electrical impulses. These electrical impulses were then transmitted over the air to a receiver. The disk on the receiver spun at exactly the same speed as the disk on the transmitter’s camera (the motors were synchronized to ensure precise transmissions).

The receiving end featured a radio receiver, which picked up the transmissions and fed them to a neon lamp placed behind the disk. The disk rotated while the lamp emitted light in proportion to the electrical signal coming from the receiver.

Ultimately, this system allowed you to view the image on the other side of the disk – although you’d need a magnifying glass to see it clearly.
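To get a feel for just how coarse that picture was, here is a minimal sketch – purely illustrative, using only the figures quoted above (a 30-hole disk spinning 12.5 times per second, one revolution per frame) – that works out the scan timing such a system implied.

```python
# Minimal sketch of the scan timing implied by Baird's early system:
# a 30-hole disk spinning 12.5 times per second, with one revolution
# tracing one frame. The figures come from the text above; the rest is
# just arithmetic for illustration.

LINES_PER_FRAME = 30        # one scanning hole per picture line
FRAMES_PER_SECOND = 12.5    # one disk revolution per frame

lines_per_second = LINES_PER_FRAME * FRAMES_PER_SECOND
ms_per_line = 1000 / lines_per_second

print(f"Lines per second: {lines_per_second:.0f}")  # 375 lines/s
print(f"Time per line:    {ms_per_line:.2f} ms")    # ~2.67 ms
```

With only 30 lines redrawn 12.5 times per second, the picture was tiny and flickery – which is why the magnifying glass was part of the viewing experience.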

Electronic Televisions

There’s a reason we stopped using mechanical televisions: electronic televisions were vastly superior.

Electronic televisions relied on a technology called the cathode ray tube (CRT), along with two or more anodes. The anodes were the positive terminals and the cathode was the negative terminal.

The “cathode” part of the cathode ray tube was a heated filament enclosed in a glass tube (the “T” of CRT). The cathode released a beam of electrons into the empty space of the tube (which was actually a vacuum).

All of these released electrons had a negative charge and would thus be attracted to positively charged anodes. These anodes were found at the end of the CRT, which was the television screen. As the electrons were released at one end, they were displayed on the television screen at the other end.

Of course, firing electrons against a glass screen doesn’t make images. To make images, the inside of the television screen would be coated with phosphor. The electrons would paint an image on the screen one line at a time.

To control the firing of electrons, CRTs used two “steering coils”. Both steering coils used magnetic fields to push the electron beam to the desired location on the screen. One steering coil pushed the electrons up or down, while the other pushed them left or right.
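To make the “one line at a time” idea concrete, here is a minimal sketch – purely illustrative, not any manufacturer’s actual circuitry – in which one loop stands in for the vertical steering coil stepping the beam down a line, the other stands in for the horizontal coil sweeping it across, and the signal level at each position sets how brightly that spot of phosphor glows.

```python
# Purely illustrative raster-scan sketch. The nested loops stand in for the
# two steering coils described above: the outer loop steps the beam down one
# line at a time, the inner loop sweeps it across that line, and the signal
# value at each position sets the brightness of the corresponding phosphor dot.

WIDTH, HEIGHT = 16, 8  # a tiny hypothetical screen, for illustration only

def scan_frame(signal):
    """Paint one frame line by line, the way a CRT electron beam does."""
    screen = [[0.0] * WIDTH for _ in range(HEIGHT)]
    for y in range(HEIGHT):          # vertical coil: move to the next line
        for x in range(WIDTH):       # horizontal coil: sweep along the line
            screen[y][x] = signal(x, y)  # beam intensity -> phosphor glow
    return screen

# Example "signal": a simple checkerboard standing in for picture content.
frame = scan_frame(lambda x, y: float((x + y) % 2))
for row in frame:
    print("".join("#" if value > 0.5 else "." for value in row))
```

Real sets did this continuously with analog deflection currents rather than loops, of course, but the order in which the screen gets painted is the same.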

The First Television Stations in America

The world’s first television stations started appearing in America in the late 1920s and early 1930s.

The first mechanical TV station was called W3XK and was created by Charles Francis Jenkins (one of the inventors of the mechanical television). That TV station aired its first broadcast on July 2, 1928.

One of the world’s first television stations, WRGB, has the honor of being the world’s only station in continuous operation from 1926 to the present day.

The First Television Sets in America

America’s first commercially produced television sets were based on the mechanical television system and John Baird’s television designs. These sets were shown to the public in September 1928.

It would take until 1938, however, before American electronic television sets were produced and released commercially. They were an instant hit after release.

The First Remote Control for Television Sets

The world’s first television remote control was called the Tele Zoom, and it could barely even be categorized as a remote control. The Tele Zoom could only be used to “zoom in” on the picture on the television; you could not use it to change channels or turn the TV on or off. The Tele Zoom was released in 1948.

The first “true” remote control was produced by Zenith and released in 1955. This remote control could turn the television on or off and change the channel. It was also completely wireless.

The First Television Program in America

Today, American networks play thousands of different programs every day. Every single one of these programs, however, owes its existence to America’s first television program, which was called The Queen’s Messenger. That program was first shown in 1928 by WRGB station.

We’re not 100% sure that The Queen’s Messenger was the first TV program shown in America. In 1928, the program is thought to have been broadcast to only four television sets. Not 400. Not 4,000. Four. With so few viewers and so little surviving documentation, there is some ambiguity and debate over whether this was actually the first television program.

America’s First Television Commercial

The first television station in America started broadcasting in 1928. For the first 13 years of its existence, television remained blissfully commercial-free. The first television advertisement did not air in America until July 1, 1941. The ad was for a Bulova watch, lasted just 10 seconds, and aired on NBC.

Color Television in America

Color television traces its roots as far back as 1904, when a German inventor received a patent for color television. However, that inventor did not actually have a working color television – it was just a patented idea.

A color television system was conceptualized in 1925 by inventor Vladimir Zworykin. However, this system was never converted into reality; all attempts to build it failed.

Color television was placed on the backburner for about 20 years. In 1946, the idea of color television was renewed in earnest. As TheHistoryOfTelevision.com explains,

“By 1946, the Second World War was history, and people in America wanted to make up for all the time lost to the war. Black and white television was thought of as old and it was time to do something new. This is when color television systems first began to be considered seriously.”

The color television war in America was fought between two industry giants: CBS and RCA. CBS was the first company to create a color television set. However, the main drawback was that it was a mechanical television based on John Baird’s original system. Thus, it was not compatible with black and white TV sets in use across America.

Despite this major flaw, the FCC declared that the CBS color television was going to be the national standard.

RCA protested, stating that it was unfair to make CBS color TV the standard when it could not even be used by the millions of customers across America (most of whom owned RCA televisions).

Unfazed, RCA continued to develop its own color television system, one that would be compatible with its customers’ existing RCA sets. In 1953, the FCC acknowledged that RCA’s color TV system was better. Starting in 1954, color RCA TV systems were sold across America.

Color TV faced the same initial problem that 3D TV and other technologies would later face: the sets existed, but broadcasters weren’t producing color TV content. As a result, few people bought color TV sets between 1954 and 1965. However, starting in 1966, color TV programming was broadcast across America, leading to a surge in sales of color television sets.

Timeline of TV History Between the 1950s and 2000s

Between the 1950s and 2000s, television turned from a niche technology into a critical form of communication found in living rooms across the nation. A vast number of changes and improvements took place in the second half of the 20th century to make the television into what it is today. Here’s a timeline:

  • 1949: In January, the number of TV stations had grown to 98 in 58 market areas.
  • 1949: The FCC adopted the Fairness Doctrine, which made broadcasters responsible for seeking out and presenting all sides of an issue when covering controversy. This act was a supplement to the Communications Act of 1934, which required broadcasters to give equal airtime to candidates running in elections.
  • 1951: I Love Lucy, sponsored by Philip Morris, was born. The half-hour sitcom ranked as the number one program in the nation for four of its first six full seasons.
  • 1951: On June 21, CBS broadcast the first color program. As mentioned above, CBS’s color system only worked with a small number of TVs across America. Only 12 customers across America could see the first color TV broadcast; the other 12 million TVs showed nothing but a blank screen for this program.
  • 1952: Bob Hope takes his comedy from radio to TV as The Bob Hope Show debuts in October, 1952.
  • 1952: By the end of 1952, TVs could be found in 20 million households across America, a rise of 33% from the previous year. U.S. advertisers spent a total of $288 million on television advertising time, an increase of 38.8% from 1951.
  • 1953: RCA releases its color broadcasting system, which worked on 12 million TVs instead of 12.
  • 1954: NBC launches The Tonight Show with comedian Steve Allen.
  • 1955: Gunsmoke, the classic western TV show, began its 20 year run on CBS.
  • 1958: 525 cable TV systems across America serve 450,000 subscribers. In response, CBS takes out a two page advertisement in TV Guide stating that “Free television as we know it cannot survive alongside pay television.”
  • 1960: Four debates between John F. Kennedy and Richard Nixon were broadcast throughout the year across the country, forever changing the way presidents would campaign.
  • 1963: For the first time in history, television surpasses newspapers as an information source. In a poll this year, 36% of Americans found TV to be a more reliable source than print, which was favored by 24%.
  • 1964: The FCC regulates cable for the first time. The FCC required operators to black out programming that comes in from distant markets and duplicates a local station’s own programming (if the local station demanded it).
  • 1964: 73 million viewers watch The Beatles appear on the Ed Sullivan Show.
  • 1965: NBC calls itself The Full Color Network and broadcasts 96% of its programming in color.
  • 1969: Astronaut Neil Armstrong walks on the moon for the first time as millions of American viewers watch live on network TV.
  • 1970: The FCC implements the Financial Interest Syndication Rules that prohibit the three major networks from owning and controlling the rebroadcast of private shows. This meant 30 minutes of programming each night were given back to local stations in the top 50 markets, encouraging the production of local programming.
  • 1971: Advertisements transition from 60 seconds in average length to 30 seconds.
  • 1979: Some people believe it’s the “beginning of the end for TV” as a poll indicated that 44% of Americans were unhappy with current programming and 49% were watching TV less than they did a few years earlier.
  • 1979: ESPN, a network totally devoted to sports, debuts on cable. ESPN would go on to become the largest and most successful basic cable channel.
  • 1980: Ted Turner launches Cable News Network (CNN), a channel devoted to showcasing news 24 hours a day.
  • 1981: Music Television (MTV) makes its debut in August of 1981.
  • 1986: After years of rising rates, ABC, CBS, and NBC have trouble selling commercial time for sports programs for the first time. Commercial rates for the 1986 NFL season dropped 15% from the 1985 season.
  • 1989: Pay Per View begins to leave its mark on the television landscape, reaching about 20% of all wired households.
  • 1992: Infomercials explode with growth. This year, the National Infomercial Marketing Association estimates infomercials generate sales of $750 million, double that of 1988.
  • 1993: At the start of 1993, 98% of American households owned at least one TV, with 64% owning two or more sets.
  • 1996: Digital satellite dishes 18 inches in diameter hit the market, becoming the bestselling electronic item in history next to the VCR.
  • 1997: The Digital Video Disc (DVD) is introduced in the United States.
  • 2004: DVDs outsell VHS tapes for the first time.
  • 2005: Flat-screen TVs and HDTVs become widely available.
  • 2006: Flat-screen TVs and HDTVs become affordable for the average consumer.
  • 2006: Sony releases its Blu-ray disc format, capable of holding up to 27GB despite being the same size as a DVD.
  • 2010: 3D televisions start hitting the market, spurred by popular 3D blockbusters like Avatar.
