Issue 690 - 1/19/2015
IT'S NO SCHEME, SNIPPETZ GETS THE ANGLE ON PYRAMIDS! by Lindsey Harrison “Death comes to all, but great achievements build a monument which shall endure until the sun grows cold.” – Ralph Waldo Emerson Of all the manmade objects in the world, it’s safe to say that there isn’t much mystery in how we built them. We can travel to a factory and watch a car being built. We can drive down the street and watch a house being constructed. But one type of object has always seemed to stump researchers and that’s the massive pyramids found in Egypt. Of course, there are other pyramids in other places around the world, but they aren’t nearly as popular or mystifying as the Great Pyramids of Giza. So what’s the big deal about these pyramids anyway? Sure, they’re huge and yes, they’ve been around for thousands of years. Okay, maybe that’s enough to be interesting, but Snippetz couldn’t just leave it at that. We had to go digging ourselves to see what we could unearth about pyramids around the globe! History of the pyramid In the beginning, royalty in ancient Egypt was buried in tombs that were carved into rock and covered with flat-roofed rectangular structures called “mastabas.” These structures date back to about 2950 B.C. and are considered the precursors to pyramids. Egypt’s oldest known pyramid dates back to 2630 B.C. at Saqqara and was built for the third dynasty’s King Djoser. The pyramid originated as a mastaba but it took on a life of its own and developed into what we now call a Step Pyramid. Supposedly, the pyramid’s architect was Imhotep, a priest and healer who would eventually come to be known as the patron saint of scribes and physicians. During Djoser’s almost 20-year reign, the pyramid builders created six stepped layers of stone, rather than the traditional mud-bricks used in earlier tombs. Eventually the layers reached a height of 204 feet, earning it the distinction as the tallest building of its time. The earliest tomb that was constructed as a smooth-sided (rather than step) pyramid was the Red Pyramid at Dahshur. It was one of three burial structures built for the first king of the fourth dynasty, Sneferu. The name originates from the color of the limestone blocks used to build the core of the pyramid. The Great Pyramids of Giza These pyramids are almost assuredly what come to mind when someone is referring to a pyramid. Can you really blame them? After all, the oldest and largest of the three pyramids at Giza (the Great Pyramid) is the only structure of the Seven Wonders of the Ancient World that is still standing today. It was built for Khufu, the second king of the fourth dynasty and Sneferu’s successor. It measures about 755 feet on each side at the base and its original height was 481 feet, officially earning it the title of largest pyramid in the world. Around Khufu’s pyramid stand three small pyramids, each built for one of his queens. In order to build the Great Pyramid, about 2.3 million blocks, weighing on average 2.5 tons each, had to be cut, moved and assembled at the site. According to the ancient Greek historian Herodotus, construction took 20 years and needed 100,000 men to complete. More recent archaeological evidence points to the time period as about 200 years and that the number of men needed to build it was probably closer to 20,000. The middle pyramid at Giza was built for Khafre, Khufu’s son. Inside Khafre’s pyramid complex sits the Great Sphinx, a guardian statue that was the largest statue of its time measuring 240 feet long and 66 feet high. 
The Great Sphinx has the body of a lion and the head of a man and would come to be worshiped as the image of the god Horus. The last and southernmost pyramid was built for Menkaure, Khafre’s son. As the smallest of the three pyramids at Giza, it stands 218 feet tall and is considered a precursor to the smaller pyramids that Egyptians constructed during the fifth and sixth dynasties. Since their original construction, the Great Pyramids of Giza have undergone some structural changes; specifically, tomb raiders plundered the smooth white limestone coverings on the outside of them, and an earthquake in the 14th century loosened many more. Because of this and probably weathering over a few thousand years, the Great Pyramid only stands 451 feet high today.
What’s with the shiny limestone?
Originally, the pyramids at Giza were covered with highly polished limestone. One estimate indicates that the original casings were so reflective that they acted like huge mirrors, and that the reflected light was so powerful it would have been visible from the moon. In fact, the ancient Egyptians called the Great Pyramid “Ikhet,” which means “Glorious Light.”
The Great Pyramid of Cholula
Although shorter than the Great Pyramid of Giza, the Great Pyramid of Cholula in Puebla, Mexico is actually the largest pyramid in the world, according to the Guinness Book of World Records. In fact, it’s also the largest monument ever constructed anywhere in the world, with a total estimated volume of over 4.45 million cubic meters (the Great Pyramid of Giza is about 2.5 million cubic meters). Built in four stages from the third century B.C. to the ninth century A.D., this pyramid was dedicated to the deity Quetzalcoatl. According to myth, it was built out of adobe bricks by a giant named Xelhua after he escaped a flood in the nearby Valley of Mexico. The construction of the Great Pyramid of Cholula more closely resembles a Step Pyramid than a smooth-sided one but the steps are much fewer and the depth of each step is considerably greater. Today, it appears more like a grassy hill with a church on top.
Pyramid Snippetz
Issue 691 - 1/26/2015
THIS AIN'T NO RACKET, SNIPPETZ IS INVESTIGATING AL (SCARFACE) CAPONE by Lindsey Harrison “You can get much farther with a kind word and a gun than you can with a kind word alone.”
– Al Capone
Even if you know nothing about American history, chances are good that you’ve heard the name Al Capone. He has been the subject of numerous books, articles and movies, including a segment from Mario Puzo’s “The Godfather.” The idea that a man essentially chose the gangster lifestyle rather than having been brought up in rough circumstances and being forced into it is only part of the draw Capone seems to have. He was incredibly successful and incredibly brutal. With such an interesting character in our country’s history, it would be irresponsible of us here at Snippetz not to investigate more about Capone. And so, we invite you to come with us as we take a look at the life of a man who chose to live on the wrong side of the law.
Early life
Al Capone was born Alphonse Gabriel Capone in Brooklyn, New York on January 17, 1899. His parents, Gabriele Capone and Teresina Raiola, were Italian immigrants who came to the United States in 1893. Al was one of nine children and by all accounts, seemed to be just a regular kid. Capone’s father was a barber and his mother was a seamstress. The family initially settled in a neighborhood by the Navy yard but soon, they moved into an apartment over Gabriele’s barbershop in a better part of town. Their new neighborhood was more culturally diverse, with people representing many different ethnicities. Capone attended Catholic school and many have theorized that the inadequate education he received, coupled with corporal punishment delivered for a wide variety of offenses, played a role in his later affinity towards violence. Although he was said to have been a decent student despite his school’s shortfalls, Capone was expelled when he was 14 for hitting a female teacher. Supposedly, she hit him first but whether or not that was true, Capone was thrown out and never went back. Following his expulsion, Capone made the acquaintance of a notorious gangster named Johnny Torrio.
Movin’ on up
Capone was involved in several small-time gangs, including the Junior Forty Thieves, the Bowery Boys, the Brooklyn Rippers, Torrio’s James Street Boys and then eventually the more powerful Five Points Gang. During this time, Capone met Mae Josephine Coughlin. The pair hit it off big time and in December 1918, Mae gave birth to the couple’s son, Albert Francis Capone. Later that month, on December 30, Al and Mae got married, after obtaining written consent from his parents since he was only 19 (the legal age to marry at the time was 21). About a year later, Torrio, who had moved to Chicago to help run the city’s massive brothel business, extended the invitation for the Capone family to move to Chicago. Al began his career there as a bouncer in a brothel. Rumor has it that either Capone or another gangster named Frankie Yale assassinated Torrio’s boss, Big Jim Colosimo, in 1920. His death meant Torrio was the new kingpin. With the inception of Prohibition, Torrio got into the bootlegging business and acquired a vast fortune through his illegal business ventures. In 1925, Torrio handed over the reins to Capone, who became the new boss of the Chicago crime syndicate, including gambling, prostitution and bootlegging. Capone maintained his status by systematically killing his rivals and members of rival gangs. Torrio owned the Four Deuces in Chicago’s Levee area, which served as the headquarters for his rackets as well as a speakeasy, gambling joint and brothel.
He recognized Capone’s abilities and after deciding that Al had enough experience under his belt, Torrio dubbed him a partner and Capone took over as the Four Deuces’ manager. Naughty, naughty When authorities began to crack down on racketeering in Chicago, Capone moved to Cicero. Capone liked to maintain an air of respectability and using his power, he managed to pull off the farce that he was just an ordinary businessman. But his temper gave him away. When his friend Jack Guzik was jumped by a small-time gangster wannabe, Capone tracked the man down and shot and killed him in a bar. Although the case seemed pretty cut and dry, a mysterious lack of witnesses kept Capone from having to answer for the murder. This behavior became commonplace, with Capone masterminding attacks against anyone who had done him or his friends wrong. He was behind two separate events that came to be known as The Adonis Club Massacre and the St. Valentine’s Day Massacre. Both incidents involved the killing of rival gangs or enemies of Capone or his friends. But as usual, a lack of evidence kept Capone from being charged. Capone’s undoing Even with his long history of violence, Capone’s arrest finally came because he hadn’t been paying taxes on his illegal businesses. He attended a “gangsters” conference in May 1929, after which he went out to see a movie. He was arrested for carrying a concealed weapon and separate charges were drawn up for his tax evasion and Prohibition violations. By this time, his empire was said to be worth, over $62,000,000. An investigation into Capone’s tax liability followed and the grand jury handed down an indictment with 22 counts of tax evasion, totaling more than $200,000. He and 68 other members of his gang were charged with 5,000 separate violations against the Volstead Act, formally known as the National Prohibition Act. On October 17, 1931, Capone was found guilty of several counts of tax evasion and sentenced to 11 years in prison, $50,000 in fines and $30,000 in court costs. Capone served his first stint in a prison in Atlanta but was transferred to Alcatraz in August, 1934. Due to a case of untreated syphilis, Capone’s health deteriorated quickly. He was released due to good behavior in November 1939. When he was released, his cognitive abilities were said to be that of a 12-year-old. Capone died on January 25, 1947. Capone Snippetz
Issue 692 - 2/2/2015
SNIPPETZ HAS GONE FISHIN'... FOR INFORMATION ABOUT PIRANHAS! by Lindsey Harrison “A reputation for a thousand years may depend upon the conduct of a single moment.” – Ernest Bramah, English writer Rumors are nasty things. No one knows that better than the piranha. Why, you ask? Well, because piranhas aren’t exactly the mindless killing machines they’ve been made out to be. How do we know this? Because we’re Snippetz and we know these things. Actually, it’s because we decided that there was something interesting about these misunderstood fish and we wanted to know more about them. We cast our lines into the water to see what information we could catch. And boy, did we find some great stuff! So take a journey with us into the world of the piranha and we’ll show you the truth about these amazing fish. General piranha information
Thanks, Teddy
We mentioned before that piranhas have gotten a bad rap. They’ve been portrayed as dangerous and aggressive. Truth be told, most of the time they aren’t. Their reputation actually came from President Theodore Roosevelt. The story goes that in 1913, President Roosevelt went on a trip to Brazil. Wanting to impress the president with what piranhas are capable of, Brazilian scientist Miranda Ribeiro decided to block off a small section of a local river with nets. Ribeiro then had that section stocked with thousands of pole-caught piranhas that had been starved for several days. When Roosevelt and his crew pulled up in their boat, they were warned not to stick their hands in the water or they risked an attack by the fish that lived there. Roosevelt wasn’t sold on the idea and he and his entourage asked for proof. Ribeiro had a sickly cow herded down to the riverbank and into the water. Within seconds, the piranhas swarmed the cow and devoured it. Stunned, Roosevelt concluded that piranhas were “the most ferocious fish in the world.” In 1914, in his book, “Through the Brazilian Wilderness,” he wrote, “Even the most formidable fish, the sharks or the barracudas, usually attack things smaller than themselves. But piranhas habitually attack things much larger than themselves . . . they will rend and devour alive any wounded man or beast; for blood in the water excites them to madness.” Of course, natives of the areas where piranhas are found know that piranhas aren’t as dangerous as they have been made out to be. Often, they will swim in the same bodies of water with the fish and emerge completely unharmed. Naturally, the occasional bite does occur but the frequency has been greatly exaggerated. However, if you decide to put your hand in an aquarium full of piranhas, there’s no guarantee that you won’t get nipped. So maybe don’t try that. But there are actually no documented reports of someone being killed in a piranha attack.
What’s that sound?
Did you know that some piranhas can bark? It’s true. The red-bellied species of piranhas make bark-like noises in a variety of situations. For instance, in a visual “staring contest” with another fish, they make quick calls that sound similar to barks. Scientists have interpreted this as a warning to the other fish not to mess with that piranha. If the piranha is actively circling or fighting another fish, it will emit low grunts or thud sounds, which researchers believe to be more of a direct threat to the other fish. If the other fish isn’t scared away by the first two vocalization types, the piranha will chase the other fish and gnash its teeth. The first two vocalizations are created using the piranha’s swim bladder, which is a gas-containing organ that helps keep the fish afloat. By contracting and relaxing the muscles around the swim bladder, the piranha can make noises of different frequencies.
Piranha Snippetz
Issue 693 - 2/9/2015
SIT DOWN WITH A COKE AND SNIPPETZ AS YOU TAKE A SIP OF SODA HISTORY by Lindsey Harrison “Coca-Cola is the only business in the world where no matter which country or town or village you are in, if someone asks, ‘What do you do?’ and you say you work for Coca-Cola, you never have to answer the question, ‘What is that?’”
– Muhtar Kent, American businessman
When you are eating at a restaurant and the server asks you what you would like to drink, how do you answer if you’d like a soft drink? In some places in the United States, you’d say, “A Coke, please.” With an answer like that, the server might assume that you’re asking for a Coca-Cola. But in some parts of the United States, the term “Coke” refers to all soft drinks. In those places, the server would ask you what kind of Coke you want and you’d have to be more specific so you actually get what you’re asking for. How incredible is that? One particular brand of soft drink is so pervasive, so popular, that in some places, it has come to mean every single type of soft drink available. How could Snippetz just ignore this soda juggernaut? It isn’t possible so we decided to search through the annals of history to find out all about Coca-Cola.
Beginnings of a soft drink colossus
The brainchild that is Coca-Cola came about thanks to a pharmacist named John Pemberton in 1886. After fighting in the Civil War, Pemberton decided he wanted to create something that would bring him a modicum of commercial success. However, he had failed miserably in every other venture he undertook. Pemberton eventually moved to Atlanta, where he decided to take a crack at the budding soda-fountain beverage market. After finalizing its formula, Pemberton realized he had no idea how to get his product out to the customers. Enter Frank Robinson. He helped Pemberton obtain a patent for Coca-Cola’s formula and design the logo. Robinson also wrote the company’s first slogan: “The Pause That Refreshes.” Unfortunately, Coca-Cola didn’t do so well during its first year and Pemberton wouldn’t live to see it become a success because he died in August 1888.
Candler to the rescue!
Following Pemberton’s death, Asa Griggs Candler bought the business and breathed some life into the dying company in 1891. Using traveling salesmen who passed out coupons for a free Coke, Candler managed to get people to try the drink. He thought (probably correctly in most instances) that, once people tried it, they would love it and want to buy it. Additionally, Candler decided that he had to get the word out about his soft drink so he began putting the Coca-Cola logo on calendars, notebooks and bookmarks to reach as many people as possible. Undoubtedly, this contributed to Coca-Cola becoming a national brand, rather than a regional one, confined to Atlanta and the surrounding areas.
Medicine or myth?
Candler was actually the man responsible for touting Coca-Cola’s medicinal properties. Actually, it was the syrup he patented as medicine but people soon assumed the two were interchangeable and began to buy the syrup and the drink in order to get rid of fatigue and headaches. Strangely enough, the syrup was also touted as a “nerve tonic” that would calm people down when it was initially invented by Pemberton. However, in 1898, the U.S. Congress passed a tax on all medicines and Candler immediately sought to have Coca-Cola sold strictly as a beverage to avoid paying the tax. After a court battle, Candler won and his product was no longer sold as medicine.
Cocaine or no cocaine?
Even though the Coca-Cola Company denied it, the drink actually did contain a small amount of cocaine when it was first developed. In fact, the negligible amount of the drug remained in the soda’s formula until 1903.
Candler said that he would shut the company down if he ever found out that the drink was doing harm to anyone and always maintained that, even though the drink used coca leaves, it was mixed with different kinds of alkaloids. The drug cocaine was a pure alkaloid, he said, and it would take about 30 glasses to produce one dose of the cocaine drug.
Candler’s worst career move . . . EVER
Initially, Coca-Cola was sold out of soda fountains and there was no way to purchase a drink in a can or bottle like there is today. That’s where Ben Franklin Thomas and Joseph Whitehead come into play. In 1899, the pair approached Candler with the idea of bottling the drink. While Candler didn’t think bottling the drink was a worthwhile endeavor, he allowed Thomas and Whitehead to try, as long as they didn’t sacrifice quality. Candler drew up a contract that didn’t set a term, so Thomas and Whitehead could basically have the rights to bottle the drink for as long as they wanted. They could also sell the rights to any bottling plants they created. Candler gave away the rights for nothing, nada, zip, zilch. Because Candler believed the only way to bottle a drink was with a Hutchinson stopper (a rubber stopper held in place by a wire; to open the bottle, the wire had to be pushed in or “popped”), he didn’t think the pair would be successful. He assumed that bottling method would cause serious problems for the quality of the drink. What he failed to foresee was that technology moves at the speed of light. By 1900, bottle caps had been invented and were beginning to gain in popularity. Thomas and Whitehead apparently knew there was a change on the horizon when they approached Candler, and ultimately got the perfect product at the perfect time for the perfect amount of money. The bottle cap method was wildly successful and by the early 1900s, bottled Coca-Cola was available at grocers and saloons. By 1909, 379 bottling plants had cropped up throughout the country.
Santa and Coke
Before artist Haddon Sundblom created the Santa Claus Coca-Cola ads in 1931, the jolly old elf was typically portrayed as an average-sized man in blue, green, yellow or red. But Sundblom changed all that by portraying Santa as a plump, rosy-cheeked man with a long white beard in the advertisements he created. After the advent of Sundblom’s ads, all depictions of Santa began to look that way.
Coca-Cola Snippetz
Issue 694 - 2/16/2015
WHAT DID YOU SAY? SNIPPETZ SPELLS OUT SOME INTERESTING LANGUAGE FACTS by Lindsey Harrison “‘No’ is the second shortest word in the English language, but one of the hardest to say.” – Raymond Arroyo, American author Language is a critical part of human culture. Without it, communication becomes significantly more difficult. But language itself can be tricky, too. Think about it. The words “rough,” “cough” and “though” all end with “ough” but not one rhymes with any other. In fact, we pronounce “ough” in nine different ways. What’s that all about? Whose great idea was it to make thing so hard? The thing is, some things that don’t seem to make sense about language really do make sense in the grand scheme of things. Languages came about out of a need to communicate effectively so the rules and regulations placed on them have evolved over time. The ancient Greeks didn’t set out to confuse people when they created their alphabet. So even though there are things about words and language that seem a little strange, there’s a reason behind all of it. And, of course, Snippetz decided that, since we deal in words, it might be good to learn a little more about them and the languages they make up! When is the alphabet not just the alphabet? To start off, the word “alphabet” comes from the first two Greek letters, “alpha” and “beta.” Now, the alphabet is usually assumed to be just the list of letters from a language. But before the 1917 Communist Revolution in Russia, the letters actually had names. The names, when read in alphabetic order, composed a message. Although the exact meaning hasn’t been clearly determined, it goes a little something like this: “Knowing all these letters renders speech a virtue. Evil lives on Earth eternally, and each person must think of repentance, with speech and word making firm in their mind the faith in Christ and the Kingdom of God. Whisper [the letters] frequently to make them yours by this repetition in order to write and live according to laws of God.” Palindrome A palindrome is a word that is spelled the same both forwards and backwards. For example, racecar is a palindrome. But racecar is pretty tame when compared to the longest palindrome found in any language. That word happens to have a very specific meaning which is “door-to-door-salesman of lye for soap” in Finnish and looks like this: saippuakivikauppias. Where did THAT come from? Have you ever looked at a word and wondered, “Where the heck did that come from?” If not, here’s something to consider: the “&” sign is called an ampersand. It is a symbol formed when you mash the Latin letters “e” and “t” together. “Et” in Latin means “and” which is also what the ampersand symbol means. But why is it called an ampersand? That word is actually a mash-up of the phrase “and per se and,” which literally means that the character “&” by itself is the word “and.” “And per se and” became “ampersand.” You say “octopuses,” I say “octopi” Most people think that the plural form of “octopus” is “octopi.” But it’s actually not. You might be tempted to assume that, if “octopi” is wrong, then the correct word is “octopuses.” Academically, that’s the agreed-upon term, but technically speaking, they’re both wrong. “Octopus” is derived from the Greek term “eight legs” and so the pluralized form should follow the Greek rules, making it “octopodes.” Grammar can be confusing . . . 
If you’re the type of person who silently corrects other people’s grammar (and you know who you are), there are times when a sentence sounds funny but is actually grammatically correct. In fact, the following sentence is grammatically correct, as long as the correct punctuation is used: “In his essay, Chuck, where Alexis had had ‘had,’ had had ‘had had’; ‘had had’ had had a better effect on the teacher.” Likewise, the next sentence is also grammatically correct, no matter how strange it may sound. “It is true for all that that that that that that that refers to is not the same that that that that refers to.” A simpler way to understand might be to read it like this: “It is true for all that, that that ‘that’ which that ‘that’ refers to is not the same ‘that’ which that ‘that’ refers to.” Or maybe not . . . That’s a smooth blend . . . When a new word is formed by blending two words together, it’s called a “portmanteau word.” In recent times, they’ve become increasingly common. Two standard examples are “brunch,” which comes from “breakfast” and “lunch,” and “guesstimate,” which comes from “guess” and “estimate.” However, consider the term “ginormous.” It was added to the dictionary fairly recently, thanks to the movie, “Elf” and is a blend of “giant” and “enormous.” But here’s another fairly new term: “hangry.” It’s a blend of “hungry” and “angry” and is commonly used to describe someone who gets angry when they’re hungry. We bring this to your attention because, prior to the advent of “hangry,” there were only two words in the English language that ended in “gry.” If “hangry” gets added to the dictionary, that’ll make three! Say that again? When a name is tautological, it repeats itself. Many geographical places have names that do this. For instance, the Sahara Desert is technically the “Desert Desert” because “sahara” in Arabic is “desert.” Other similar names include Mount Fujiyama, which translates from Japanese as “Mount Fuji-mount.” Lake Tahoe actually means “Lake Lake” in Washo. The Mississippi River is “Big River River” in Algonquin. And what about Torpenhow Hill in west Englans? Well, “tor” and “pen” are both Celtic terms for “hill,” while “how” is an Anglo-Saxon term for “hill.” Thus, Torpenhow Hill literally means “Hillhillhill Hill.” Clearly, these tautological terms are abundant around the world. But here’s something to consider the next time you go to Serrano’s for a drink. When you order a chai tea, you’re actually ordering a “tea tea” since “chai” is the term for “tea” in Hindi. Perhaps, to avoid repeating yourself, you could simply order a chai. Language Snippetz
Issue 695 - 2/23/2015
"THAT'S ALL FOLKS!" SNIPPETZ PRESENTS THE LAST FILMS OF NOTEWORTHY MOVIE STARS by Lindsey Harrison “The end of a picture is always the end of a life.” – Sam Peckinpah, American director We all leave behind a legacy when we pass on. Some of us have lots of kids who have their own kids and our large family is our legacy. Others of us don’t have big families but our life’s work represents the legacy we leave behind. Consider famous sculptors, painters, musicians or writers. The same can be said for actors and actresses. Their craft of making movies and television shows is what solidifies their place in our minds long after they’re gone. Sometimes, we lose someone so iconic and talented that their passing almost defines their celebrity status as much as their performances do. Other times, we lose someone that has sort of always been a staple in lives through our favorite television show or movie and it feels like we’ve lost a member of the family. We at Snippetz know that, in many instances, these stars left us all too soon. So we decided to look back at the last movies they made, good or bad, to see if they went out with a whisper or a bang. James Dean When you think of an iconic star that passed away at the height of his or her career, James Dean is likely to come to mind first and foremost. At 24 years old, Dean’s career had barely begun. He had already received great reviews for his starring roles in “East of Eden” and “Rebel Without a Cause.” But his life was cut short when, on Sept. 20, 1955, while on his way to an auto race in Salinas, California, Dean was involved in an accident and tragically died due to his injuries. He had finished filming his last movie, “Giant,” not long before the crash. The film featured a young ranch hand who struck it rich and became an oil tycoon. Over the course of the movie, Dean’s character aged about 30 years so he had his hair dyed gray to give the film some realism. But his hair was still gray when he died, which aged him in an unnerving way. “Giant” was a hit and Dean received Oscar nominations for Best Actor in both “Giant” and “East of Eden,” making him the first person in history to receive a posthumous nomination from the Academy. Marilyn Monroe Notorious for her birthday serenade of President John F. Kennedy, as well as the iconic picture of her holding her dress down as air from a vent in the sidewalk blows it sky-high, Marilyn Monroe was yet another film star who met an untimely death. The last film she actually completed was “The Misfits,” released in 1961. Ironically, the creation of the film (and the drama therein) is often as memorable as the film itself. Considering that it was written by Monroe’s third husband, Arthur Miller, you could easily guess that there might be some tension, especially since the couple’s marriage was already headed towards a divorce. Tag on the fact that Monroe was said to be chronically late to everything and struggling with a drug addiction, it’s clear to see that the behind-the-scenes dysfunction added to the notoriety of the film. Sadly, Monroe died from a barbiturate overdose on Aug. 5, 1962 at the age of 36. Katharine Hepburn Unlike our previous two stars, Katharine Hepburn didn’t die suddenly in a tragic accident. Her career was long and successful, earning six of her movies spots on the American Film Institute’s list of the Top 100 United States Love Stories, more than any other actress who appears on that list. Hepburn filmed her final movie, “Love Affair,” in 1994. 
The film told the story of a blossoming romance between real-life couple Warren Beatty and Annette Bening, who also starred in the film. With four Oscars under her belt, Hepburn retired in the mid-1990s and died at home in 2003, at the ripe old age of 96.
John Belushi
Another star whose career was cut short is John Belushi, who found his start in show business on the television show, “Saturday Night Live,” in 1975. Many remember him as a Blues Brother, playing opposite Dan Aykroyd and securing a spot in the SNL hall of fame. Beyond that, Belushi had a successful, if short-lived, acting career. He starred in the movies “1941” and “The Blues Brothers,” also with Aykroyd. Belushi’s last film, “Neighbors,” was a dark comedy about suburban life and also co-starred Aykroyd. It was released in 1981 and just three months later, on Mar. 5, 1982, Belushi met an unfortunate end due to an overdose of cocaine and heroin. He was 33 years old.
Audrey Hepburn
Audrey Hepburn was extremely talented and was able to translate a career as a ballerina and model into one of an award-winning actress. In her first major role as Princess Ann in 1953’s “Roman Holiday,” Hepburn won the Oscar for Best Actress. Probably most memorable were her roles in “Breakfast at Tiffany’s” and “My Fair Lady,” but she didn’t win any awards for those films. Hepburn did earn the distinction of being one of only a few performers to win a Tony, an Emmy, an Oscar, and a Grammy Award. Her last film was “Always,” which was released in 1989. She died of cancer four years later at the age of 63.
Chris Farley
Chris Farley found his claim to fame in the type-cast role of the overweight, awkward misfit who secured laughs with his physical comedy. Just as his idol Belushi had done, Farley got his big break on the set of SNL. He cultivated his comedic talents into a string of successful movies including 1995’s “Tommy Boy.” Throughout his career, however, Farley battled addictions to drugs and alcohol, which took their toll on his working life. Filming of his last movie, 1998’s “Almost Heroes,” supposedly had to be put on hold several times while Farley attended rehab. Farley died on Dec. 18, 1997, shortly after completing the movie. Sadly, Farley died just as his idol had, due to an overdose of cocaine and heroin at the age of 33.
Underwhelming last films of noteworthy stars
Not every movie a famous actor or actress makes can be a blockbuster. To be honest, some are complete flops, regardless of the celebrities who are involved in their creation. Here are a few of the less-successful last movies of some major stars. Don’t worry if you don’t recognize most of the movies on the list; we didn’t either!
Issue 696 - 3/2/2015
SNIPPETZ TAKES A TRIP INTO THE SURREAL WORLD OF SALVADOR DALI by Lindsey Harrison “Surrealism is destructive, but it destroys only what it considers to be shackles limiting our vision.” – Salvador Dalí
While the name might not be very familiar, Salvador Dalí’s artwork probably is. He’s the man behind the pictures of the melting clocks. But his incredible talent and imagination didn’t stop there. Dalí created hundreds of interesting and sometimes disturbing images that helped define the surrealist movement. But it wasn’t just his artwork that was unique and a little bit strange. Dalí himself was a quirky fellow, to say the least. With a black handlebar mustache that he kept waxed to shiny perfection, Dalí was known for his wide-eyed expression that he routinely sported when being photographed. So where did he get the inspiration for his paintings? Snippetz decided that it was high time we stepped into Dalí’s world to share all the wonder and intrigue that this artist infused into his works and his life.
Early life
Salvador Domingo Felipe Jacinto Dalí i Domènech was born on May 11, 1904, in Figueres, Spain. His father, Salvador Dalí i Cusí, was a lawyer and notary whose practical side didn’t mesh well with his son’s more eccentric personality. Dalí’s mother, Felipa Domènech Ferrés, however, was much more lenient and let her son grow into the unique person he was to become. One stand-out experience that Dalí claimed helped shape his later life was a trip he took with his parents to the grave of his older brother, also named Salvador, who had died as a toddler, just nine months before Dalí was born. His parents told him that he was the reincarnation of his brother and Dalí took that to heart, later on saying that he and his late brother “resembled each other like two drops of water, but we had different reflections,” and “[he] was probably a first version of myself, but conceived too much in the absolute.” At an early age, it became clear to Dalí’s parents that their son had an extreme talent in art. They sent him to drawing school to hone his craft. In 1917, Dalí’s father organized an exhibition of his son’s charcoal drawings in their family home. In 1919, while just a teenager, Dalí held his first public exhibition at the Municipal Theater in his hometown.
From real to surreal
In 1922, Dalí enrolled in the San Fernando Academy of Fine Arts in Madrid. There, he was also introduced to several other artistic styles, like Metaphysics and Cubism. His talent made him a standout, as did his eccentric and bizarre ways. But his talent couldn’t save him from his lack of humility. About one year after he enrolled in the Academy, he was suspended for criticizing his teachers and publicly questioning the academy’s choice in professors. In 1926, Dalí reentered the academy but was permanently expelled when he claimed that the professors were idiots and that he was far more qualified than they were. Over the next three years, Dalí traveled to Paris several times and met up with several influential painters, such as Pablo Picasso and René Magritte, the latter of whom introduced Dalí to surrealism. Up until then, Dalí had focused on Impressionism, Futurism and Cubism. Surrealism became a part of his repertoire and he became associated with three general themes in his artwork: man’s universe and sensations; sexual symbolism; and ideographic imagery. In 1929, Dalí entered his first Surrealistic period in which he created small oil paintings consisting of elements from his dreams put together into a collage.
His major contribution to the Surrealist movement of that time was what he called the “paranoiac-critical method.” He described this as a mental exercise of accessing the subconscious as a way to enhance his artistic creativity.
An Andalusian dog and a Russian immigrant
That same year, Dalí expanded his artistic endeavors into the world of film-making. He teamed up with Luis Buñuel on two film projects, called “Un Chien andalou (An Andalusian Dog)” and “L’Age d’or (The Golden Age).” Dalí also met the only woman he would ever claim to love romantically, a Russian immigrant 10 years his senior, named Elena Dmitrievna Diakonova. Diakonova wasn’t exactly available at the time, though. She was actually married to Surrealist writer Paul Éluard, one of Dalí’s first influences in surrealistic writing. Dalí fell hard for Diakonova (affectionately called Gala), and pursued her relentlessly. To get her attention, he was said to have covered himself in goat dung, waxed his armpit hair and dyed the underlying skin blue, and worn flowers on his head. Gala couldn’t resist his attempts and soon left her husband to marry Dalí in a civil ceremony in 1934.
What about the melting clocks?
About two years after his introduction to the Surrealist movement, Dalí painted one of his most memorable and identifiable pieces: “The Persistence of Memory.” The painting shows melting pocket watches flowing through a landscape setting. Experts seem to agree that Dalí’s intention with the piece was to illustrate his feeling that time was not rigid and was as easily destructible as anything else.
The personality outshines the artwork
Not to say that Dalí’s artwork wasn’t as spectacular as it always had been, but his flamboyant style, kooky habits and colorful personality brought him increasing notoriety through the mid-to-late 1930s. His long, black handlebar mustache was just one element that lent itself to the overall persona that was Salvador Dalí. He would sometimes show up to public events wearing a ridiculously long cape or carrying a walking stick. Another time, at the 1934 New York exhibition of his works and the so-called social “Dalí Ball,” he showed up wearing a bra enclosed in a glass case. At the London International Surrealist Exhibition, he gave his lecture wearing a full deep-sea diving suit.
Clash of ideologies
At the onset of the Spanish Civil War in 1936, Dalí found himself in the minority as one of the few members of Spain’s cultural elite to support fascism after the Nationalist victory in 1939. The members of the Surrealist movement were closely connected to the French Communist Party and with that clash of ideals, they decided to expel Dalí from their movement. In fact, many of his former compatriots began issuing harsh statements against him, up until his death and sometimes even beyond. While Dalí continued to paint throughout the remainder of his life, he dabbled in other mediums as well. In 1960, he began work on his Theater and Museum in Figueres, which represented his single largest project and consumed most of his time through 1974, when it opened.
Final years
In 1980, Dalí’s health had taken a turn for the worse. He noticed that his right hand began trembling, as though he had developed Parkinson’s disease. Supposedly, Gala, then well into her 80s, had been dosing him with a cocktail of dangerous, unprescribed medications that ultimately and irreversibly damaged his nervous system. His artistic capacity never recovered. Following Gala’s death in 1982, Dalí seemed to have given up on life.
He purposefully dehydrated himself, which some claim was a suicide attempt. In November 1988, Dalí was admitted to the hospital with heart failure. On January 23, 1989, Dalí died at the age of 84. Dalí’s legacy remains in the over 1,500 paintings he created, as well as collaborations with Alfred Hitchcock, Coco Chanel, and Christian Dior (to name a few). And if his artwork doesn’t stick with you, perhaps his wide-eyed stare and overly-waxed jet black mustache will.
Issue 697 - 3/9/2015
IT'S HARD TO SAY GOODBYE: SNIPPETZ UNEARTHS PARTING RITUALS by Lindsey Harrison “He who is completely sanctified, or cleansed from all sin, and dies in this state, is fit for glory.” – Adam Clarke, British theologian
Death induces all kinds of behavior in people. We mourn the loss of the ones we’ve loved and often celebrate the lives they led. There’s no right or wrong way to handle the passing of someone you’re close to. For as long as people have been on the earth, we’ve found ways to handle death. Some rituals make sense on a nearly global level, like gathering with family and friends to garner strength and comfort during our time of need. Other rituals are very specific to one culture or another. Snippetz decided to examine some of the more obscure “death rituals,” to find out what exactly goes on when someone dies. More importantly, we wanted to know why these cultures do what they do. So come along with us as we journey around the world to investigate death rituals.
Sin eating
Most people have heard of certain religions that believe if you confess your sins to God before you die and ask to be forgiven, you will be granted access to Heaven. In a way, that is a death ritual. Sin eating is sort of the same thing, but it takes the idea of cleansing oneself of sin to the next level. While the ritual of sin eating can vary slightly, depending on the region, the tradition holds that sin eating takes place either at the bedside of someone who is dying or at the funeral of someone who has already passed away. The sin eater, who is usually paid a small fee (honestly, would you eat someone else’s sins for free?), sits beside the body on a low stool. Depending on the regional tradition, a loaf of bread and bowl of beer are either passed over the body or placed directly on top of it. The belief is that the food would absorb the deceased’s guilt and sin, cleansing them so they can pass on to the next life. The sin eater would then eat and drink, and pronounce that the person was free from any sins by giving a speech that goes something like this: “I give easement and rest now to thee, dear man. Come not down the lanes or in our meadows. And for thy peace I pawn my own soul. Amen.” Once all that was complete and the sin eater had left, family members supposedly burned the plate and bowl that held the ritualistic meal. As you might assume, sin eating wasn’t exactly a high-paying career, nor was it something that brought great esteem to sin eaters or their families. Often, they were social outcasts who lived alone in a secluded area of the village and had little to no contact with members of the community outside of their sin eating work. In some instances, they were avoided as completely as possible due to their widely-believed association with evil spirits and unholy practices. Sin eaters were often believed to be doomed to spend the afterlife in hell because they ingested the sins of others and carried those around with them. In fact, the Roman Catholic Church supposedly excommunicated any and all sin eaters. References to the ritual of sin eating have been traced back to early Egyptian and Greek civilizations and have been documented as recently as 1906, when the “last sin eater,” Richard Munslow of Shropshire, England, died. The tradition can also be found in the Aztec culture, in the form of Tlazolteotl, the goddess of earth, motherhood and fertility.
She is said to cleanse the souls of people confessing their sins to her on their deathbeds by “eating its filth.” The “turning of the bones” Celebrating the life of a loved one that has passed is not uncommon. Mostly, it’s done through feasts and ceremonies. But in some cultures, like the Merina people of the highlands of Madagascar, the party doesn’t stop once the deceased has been buried. In fact, every seven years, the Merina dig up the bodies of their dead relatives in the “famadihana” or “turning of the bones” ritual and change the person’s clothes. It’s meant to be a ceremony of happiness and large amounts of money are often spent on the celebration. At the height of the festivities, the Merina people gather around their deceased relatives and take pictures with them. Often, they steal a small memento, like a piece of clothing, from their dead relative, to keep under their mattress in an effort to thwart infertility. Finally, the bodies are reburied, dressed in new duds and surrounded by money and alcohol to accompany them to the afterlife. Suttee This traditional Hindu ritual was practiced in India for many years before the occupying British outlawed it in 1829. Suttee is when a grieving widow voluntarily lies next to her husband on his funeral pyre, where she is burned alive next to her late spouse. Because widows in India were extremely low on the social ladder, considered impure, shunned and abhorred by society, the widows often thought it to be far more preferable to die beside their husbands where they could be reunited after death. Self-mummification We’ve all heard about mummification in ancient Egyptian culture. But what about self-mummification? This ritual was practiced until at least the late 1800s in Japan and it’s just as it sounds. But the preparations for this ritual aren’t easy, especially since the people undertaking the task were very much alive throughout the process. We’ll spare you the details but let’s just say the preparations for self-mummification took over 2,000 to be completed properly. The purpose was to achieve enlightenment by separating oneself as far from the physical world as possible so that at death, instead of being reborn, you became one with Buddha instead, truly a great honor in Japanese culture. Off with their fingers! The Dani tribe of Papua New Guinea had a truly horrifying way of honoring their dead. Unfortunately, it was the women and girls of the tribe that suffered the most from the ritual. Whenever a family member died, the females of that family would have one of their fingers cut off. Of course, the Dani had a primitive way of numbing the hand and fingers to supposedly lessen the pain but the outcome was still the same. Why would they do this? Well, it was believed that doing so showed honor to the deceased relative. In fact, the amputated digit was ritualistically burned and then the ashes were stored in a special place to be revered for years to come. Naturally, the practice has been outlawed for quite some time, but there are still women of the tribe who are living proof of the ghastly ritual. Professional mourners While many cultures preach the quality of “keeping your cool” when something awful happens, like the death of a loved one, those same cultures often believed that the dead could (and would) exact revenge if they weren’t sufficiently mourned at their funerals. How did these cultures, such as the Taiwanese, overcome the obstacle? They hired professional mourners. 
These mourners were paid to make a huge scene, bawl their eyes out, pull on their hair, beat their chests, throw themselves on the ground, anything to show the devastation they supposedly felt. Everyone knows that it’s all for show, but the family gets to show their sorrow and pain while saving face by not having to put on such histrionics themselves. Pretty clever, huh?
Issue 698 - 3/16/2015
TYPE 1 OR TYPE 2? SNIPPETZ INVESTIGATES THE DIABETES DIAGNOSIS by Lindsey Harrison “The groundwork of all happiness is health.” – Leigh Hunt, English poet According to the Centers for Disease Control and Prevention’s 2014 National Diabetes Statistics Report, an estimated 29 million people in the United States have diabetes. Of those 29 million, 1.7 million people aged 20 years or older were newly diagnosed in 2012. As if those numbers aren’t disturbing enough, another 86 million adults in the U.S. – more than one in three – have prediabetes. It’s clear that the number of diagnoses of the disease is on the rise. So what, if anything, can we do to prevent it? You’d probably agree that the key to prevention is to truly understand what diabetes is and what causes it. Snippetz knows your health is important to you (it’s important to us, too) so we decided to do the research to help educate you about diabetes. WHAT IS DIABETES AND PREDIABETES? Diabetes – technically termed diabetes mellitus – is a group of metabolic diseases that cause a person to have high blood sugar (blood glucose). High blood sugar can result from an inadequate production of insulin or because the body’s cells do not respond properly to insulin, and sometimes it’s a combination of both. Prediabetes refers to a blood glucose level that is above normal but not high enough to be classified as diabetes. There are three main types of diabetes: type 1, type 2, and gestational diabetes. Type 1 diabetes, formerly called insulin-dependent or juvenile-onset diabetes mellitus, accounts for about 5 percent of all diagnosed diabetes cases. It develops when the cells that produce the hormone insulin (called the beta cells) in the pancreas are destroyed. That destruction is initiated by the body’s immune system and either limits or completely eliminates the production and secretion of insulin. In order to maintain appropriate levels of blood glucose, people with type 1 diabetes must increase the insulin in their bodies by injecting it or through the use of an insulin pump. Type 2 diabetes, formerly called non-insulin-dependent or adult-onset diabetes mellitus, accounts for about 90 to 95 percent of all diagnosed diabetes cases. This type usually develops due to an insulin resistance, a disorder in which the cells within the muscles, liver, and fatty tissue don’t metabolize insulin properly. This results in an increased need for insulin and the beta cells in the pancreas that produce insulin can’t keep up with the higher demand. In some people, a resistance to insulin plays a larger role in the development of diabetes than the lack of the beta cells to produce sufficient insulin, and for others, it’s just the opposite. Gestational diabetes is a form of glucose intolerance that some pregnant women develop during the second or third trimester. Elevated blood sugar levels can pose a danger to both the mother and the baby. After giving birth, about 5 to 10 percent of women will continue to experience elevated blood glucose levels, which usually results in a diagnosis of type 2 diabetes. WHY IS IT CALLED DIABETES MELLITUS? During the second century A.D., Aretus the Cappadocian, a Greek physician, noted that people with the disease experienced increased urination, called polyuria today. He named the ailment “diabetes,” which means “siphon” in Greek. In 1675, the word “mellitus” was added to the condition’s name by Thomas Willis. 
He noted that the blood and urine of people with diabetes contained excess glucose, which he deduced would make it sweet like honey. In Latin, “mel” means “honey” so it fit nicely with what he had observed. Willis wasn’t the first person to notice the “sweet” characteristic of a diabetic’s blood and urine. In fact, the ancient Chinese people noticed that ants were attracted to some people’s urine because it was sweet. Although they hadn’t fully identified what the disease was, they did give the phenomenon a name: “sweet urine disease.”
WHO IS AT RISK FOR DEVELOPING DIABETES?
There are several known risk factors that could lead to the development of type 1 diabetes, including the following: Anyone with a relative who has been diagnosed with type 1 diabetes has a slightly increased risk. The presence of a certain gene can indicate an increased risk. People who live further away from the equator tend to have a higher risk than people living closer to it. Type 1 diabetes can appear at any age, but there are two peaks in the ages at which it tends to occur: between 4 and 7 years old, and between 10 and 14 years old.
There is an increased risk of developing type 2 diabetes for people who are overweight or obese. Coupling that with physical inactivity and eating the wrong types of foods increases the risk further. In fact, drinking one can of non-diet soda per day can increase that risk by 22 percent, according to one study. Additionally, the risk increases as people get older and for men whose testosterone levels are low. Other risk factors include smoking, depression and stress, and chronic lack of sleep. Just as it sounds, gestational diabetes only occurs during pregnancy. Scientists have found that women whose diets before getting pregnant were high in animal fat and cholesterol were at a higher risk than women whose diets were lower in those two elements.
HOW CAN DIABETES BE PREVENTED?
There is no known way to prevent type 1 diabetes, although several clinical trials are currently underway with subsequent studies planned. The Diabetes Prevention Program is a major research study conducted by the CDC and has shown that, for people at high risk of developing diabetes, lifestyle interventions (like a change in diet) that lead to weight loss and increased physical activity prevented or delayed type 2 diabetes. In some instances, blood glucose levels of those individuals at a higher risk were returned to within the normal range.
FACT OR FICTION?
Rumors about diabetes are common and it’s easy to get confused about what’s fact and what’s fiction. Here’s a bit of clarity into some of those rumors.
Issue 699 - 3/23/2015
EIGHT LEGS IS MORE THAN ENOUGH! SNIPPETZ SPINS A TALE OF SPIDERS by Lindsey Harrison “People who say, ‘There’s nothing to fear from spiders,’ have clearly never been to Australia.”
– Cate Blanchett, Australian actress Long before the movie “Arachnophobia” was released in 1990, people have had an almost unnatural fear of spiders. Notice how we said “almost.” The truth is that spiders can be pretty scary. Maybe not enough to make a person scream and run away or throw things at it in an attempt to squish it before it has a chance to skuttle across the floor and walk its creepy legs all over you. But that all depends on who you are. Regardless of how rational or irrational the fear of spiders may be, they are pretty unique and interesting. So Snippetz has decided to set aside our fears of these eight-legged monsters, uh, creatures and untangle the web of misconceptions and misinformation about spiders. INSECTS VERSUS SPIDERS You’re probably thinking, “I know the difference between insects and spiders.” But if you thought long and hard about it, other than spiders having eight legs and not six, it might be difficult to actually pick out the things that make one distinct from the other. Spiders are arthropods, just as insects are but anatomically, they are different from insects because they have two body segments instead of three. Their body segments, the cephalothorax and the abdomen, are joined by a small, flexible cylindrical body part called a pedicel. Additionally, spiders do not have antennae like insects do. They also have the most centralized nervous system out of all arthropods. Most arthropods have muscles that they use to help extend their limbs to move around. Spiders don’t. Instead, they actually use hydraulic pressure to extend their limbs. Most people know that spiders have spinnerets that extend from the end of their abdomen. These appendages help spin the wide variety of webs that different species of spiders can create by extracting and placing silk from up to six different kinds of glands located in their abdomen. WHAT’S ON THE MENU? Most spiders are carnivorous, preferring to feast on insects and other spiders. Some of the larger species even target birds and lizards. Because spiders’ digestive tracts aren’t large enough for them to ingest food in solid form, they inject digestive enzymes into their victims which liquefies the insides, which spiders then suck up like a disgusting milk shake. Typically, spiders make great use of their webs by trapping their prey in them. Others will use a sticky lasso to shoot at their prey and reel it in. Still other species actually run down their prey, which is why stomping on it while emitting a high pitch scream may not be a huge overreaction. There is one species of spider, the Bagheera kiplingi, that was identified in 2008 as actually being herbivorous, meaning they eat plants rather than insects, lizards, birds or humans . . . uh, other spiders. We meant other spiders. HUNTING 101 IN THE EYE OF THE BEHOLDER Most spiders have four sets of eyes, arranged in certain patterns depending on the species. Even with all those eyes, spiders aren’t able to see things that are far away; they’re near-sighted. Even without the corrective lenses spiders would need to improve their near-sightedness, some species are able to see things humans can’t. In fact, some have been shown to be able to see in both the UVA and UVB light spectrums. THE ORIGINAL WEB-SLINGERS While Spider-Man might be advertised as a web-slinging crime fighter, we all know who the original web-slingers really are. And those webs are just as impressive in real life as they are in the Spider-Man movies. 
In fact, for its weight, spider web silk is stronger and tougher than steel. Spider silk is mostly comprised of protein that is similar to insect silk, such as the silk from silkworms. It starts out as a liquid and hardens as the spider pulls it out of its body. This action actually changes the internal structure of the silk’s proteins. What makes spider silk unique is the juxtaposition between its elasticity and its strength. Spider silk is much more elastic than other similar biological materials, like chitin (a derivative of glucose found in insect exoskeletons, among other things), collagen and cellulose. It can stretch much further without losing its shape, making it ideal for trapping prey. THIS BITES! We all know that spider bites can be nasty things. However, only a few species are actually dangerous to humans. Many people believe that tarantulas are something to be feared, probably based on their size and the size of their fangs, but in fact, if they were to bite you, it would only cause a mild irritation. Their venom is not strong enough to kill a human. Recluse and widow spiders, on the other hand, are highly venomous and a single bite can be fatal and is always a serious medical concern. Similar to a tarantula, the funnel web spider will often display its fangs as a defensive mechanism, but if it were to bite you, it probably wouldn’t inject enough to be fatal. Over the last 50 years, only 13 human deaths have been recorded due to funnel web spider bites. Even with that small number, until recently, humans still considered the funnel web spider to be the world’s most dangerous spider, based on the toxicity of their venom. But the Brazilian wandering spider overtook that designation based on the higher frequency of human encounters. So if a tarantula isn’t as scary as it seems, what is there to be afraid of? Well, tarantulas, besides being huge and scary-looking with massive fangs, also have tiny hairs all over their bodies. Why is that so worrisome? Tarantulas can actually fling these hairs to deter potential predators, porcupine-style. DANCE, DANCE It may sound strange, but spiders have inspired a type of dance. During the 16th and 17th centuries, it was believed that a person could avoid the lethality of a tarantula bite if they engaged in a harried, frenetic type of dance to a specific type of music. That music became known as a tarantella. Even Pyotr Ilyich Tchaikovsky included a tarantella in one of his Pas de Deux in “The Nutcracker.” SPIDER SNIPPETZ