Five Things We Learned About Our Human Origins in 2018

The question of what makes us human is one that is fascinating to most of us, and for many the answer lies in looking back to our roots as a species. 2018 was a fantastic year for learning new and exciting things about our origins, with ground-breaking discoveries and pioneering technology changing a number of perspectives and bringing new possibilities to light. From hybrid humans to signs of human behavior 300,000 years ago, these are some of the most significant stories from 2018.

Evidence of Modern Human Behavior in Ancient Africa

A number of finds at the Olorgesailie Basin in Kenya indicate that humans were exhibiting modern behavior as early as 300,000 years ago. Numerous obsidian artifacts were found at the site in 2018, providing strong evidence of trade networks, as obsidian is not native to Olorgesailie and must have been sourced from at least 55 miles (about 88.5 km) away.

The first evidence of human life in the Olorgesailie Basin comes from about 1.2 million years ago. The sophisticated tools (right) were carefully crafted and more specialized than the large, all-purpose handaxes (left). ( Human Origins Program, Smithsonian )

Along with the obsidian tools were finds of manganese dioxide and ochre, which show signs of having been processed for use as pigments.

This is particularly significant, as it pushes evidence of modern human behavior back tens of thousands of years earlier than previously thought, bringing it into line with the oldest known fossil remains of a modern human.

Neanderthals May Have Pioneered Cave Art

One traditional answer to the question ‘what makes us human?’ has been our ability to think symbolically and create art, but in 2018 scientists revealed that some cave art in Spain was actually made by Neanderthals. The discovery adds more evidence to the theory that Neanderthals and modern humans were not as different from one another as previously assumed.

Panel 78 in La Pasiega. The scalariform (ladder shape) composed of red horizontal and vertical lines dates to older than 64,000 years and was made by Neanderthals. ( C.D. Standish, A.W.G. Pike and D.L. Hoffmann )

It was revealed that the art must have been created by Neanderthals after an international team of scientists dated the calcite (crystal) layer which had formed on top of the ancient artwork. Because the calcite formed over the art, the paintings must already have been in place and must therefore be older than the crystal layer. The results revealed the artwork predated the arrival of modern humans in the region by a minimum of 20,000 years.

Humans Migrated ‘Out of Africa’ a Lot Earlier than Previously Thought

It is now known that modern humans evolved in Africa around 300,000 years ago before migrating to other continents. In January 2018, a group of archaeologists from Tel Aviv University working at Mount Carmel, Israel, discovered the upper jaw bone of a Homo sapiens in a layer of sediment with tools previously attributed to Neanderthals, pushing back the date for human migration out of Africa by about 40,000 years. This also supports the theory that there was more than one expansion phase, with different groups leaving Africa over a vast time period.

This is the left hemi-maxilla with teeth. ( Rolf Quam )

The implications of this discovery are huge, as it suggests humans were behaviorally modern enough to communicate and organize migratory expeditions significantly earlier than was formerly assumed.

Oldest Modern Human Footprints in the Great White North

A group of modern humans left their mark on Calvert Island, British Columbia, Canada approximately 13,000 years ago in the form of footprints. A team from the University of Victoria, including representatives from the Heiltsuk and Wuikinuxv First Nations, revealed the group of 29 footprints, which were made by at least three people and are the oldest known footprints in North America. Footprints are a very rare find, as there are few sites with preserved prints worldwide. One of the sets of prints was left by a child, and all of the trackmakers appear to have been walking barefoot.

Photograph of track #17 beside a digitally-enhanced image of the same feature, produced using the DStretch plugin for ImageJ. ( Duncan McLaren )

Although Calvert is a tiny island today, the site may have been part of a route taken by humans when they migrated between Asia and the Americas during the late Pleistocene.

First Confirmed Hominin Hybrid

The biggest archaeological discovery of 2018 probably came out of the already significant Denisova Cave in Siberia. Ten years after a new group of hominins (the Denisovans) was first identified at the site, a small fragment of bone yielded stunning results when it was positively identified as coming from the direct offspring of a Neanderthal and a Denisovan. The female, nicknamed ‘Denny’, survived to approximately 13 years of age, meaning she was well beyond infancy and may even have had children of her own.

Drawing of a Neandertal mother and a Denisovan father with their child, a girl, at Denisova Cave in Russia. ( Petra Korlević )

The discovery was made by a team led by Viviane Slon and Svante Pääbo from the Max Planck Institute for Evolutionary Anthropology in Germany. The team began by looking at Denny’s mitochondrial DNA (passed to her by her mother), which showed it was Neanderthal in origin. However, when they sequenced her nuclear genome and compared it to the genomes of Neanderthals and Denisovans from the cave, as well as to a modern human with no Neanderthal ancestry, they found that about 40% of her DNA fragments were Denisovan in origin. As her mitochondrial DNA was already confirmed to be Neanderthal, the conclusion was that Denny must have had a Neanderthal mother and a Denisovan father. Even so, Slon and Pääbo were careful not to use the word “hybrid” in their paper, as the exact classification of our closest relatives is still a matter of debate.
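To make the logic of that comparison concrete, here is a minimal Python sketch of how sequenced fragments can be tallied against two reference genomes at positions where those genomes carry different bases. It is purely illustrative and is not the pipeline Slon and Pääbo used; the site positions, alleles, reads, and function name below are all hypothetical.

from collections import Counter

# Hypothetical alleles at sites where the reference Neanderthal and Denisovan
# genomes differ: site -> (neanderthal_allele, denisovan_allele).
DIAGNOSTIC_SITES = {
    101: ("A", "G"),
    202: ("C", "T"),
    303: ("G", "A"),
    404: ("T", "C"),
}

def assign_fragments(fragments):
    """Count how many fragments carry the Neanderthal-like or Denisovan-like
    allele at the diagnostic sites they overlap. `fragments` is a list of
    (site, observed_base) pairs; real data would come from aligned ancient-DNA reads."""
    counts = Counter()
    for site, base in fragments:
        neanderthal_allele, denisovan_allele = DIAGNOSTIC_SITES[site]
        if base == neanderthal_allele:
            counts["neanderthal"] += 1
        elif base == denisovan_allele:
            counts["denisovan"] += 1
        else:
            counts["other"] += 1  # damage, sequencing error, or a third allele
    return counts

if __name__ == "__main__":
    # Toy reads: roughly half match each archaic genome, the pattern expected
    # for a first-generation offspring of one Neanderthal and one Denisovan parent.
    reads = [(101, "A"), (202, "T"), (303, "G"), (404, "C"), (101, "G"), (202, "C")]
    counts = assign_fragments(reads)
    total = sum(counts.values())
    for group, n in counts.items():
        print(f"{group}: {n}/{total} ({100 * n / total:.0f}%)")

On real data, a roughly even split between the two reference genomes, combined with purely Neanderthal mitochondrial DNA, is the pattern that points to one parent from each group.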

There is certainly more to come on this in 2019. The latest development has already arrived: an artificial intelligence genome study published in Nature Communications has identified what looks like the legacy of a Neanderthal-Denisovan hybrid population in the genomes of Asian individuals.

It’s a good start to another year of getting closer to understanding the ancient origins of humankind.



The U.S. Department of Health and Human Services (HHS) is the nation's principal agency for protecting the health of all Americans and providing essential human services.

Below is a list of major events in HHS history and a list of the Secretaries of HHS/HEW.

The Affordable Care Act was signed into law, putting in place comprehensive U.S. health insurance reforms.

The Medicare Prescription Drug Improvement and Modernization Act of 2003 was enacted - the most significant expansion of Medicare since its enactment. It included a prescription drug benefit.

The Office of Public Health Emergency Preparedness (now the Office of the Assistant Secretary for Preparedness and Response) was created to coordinate efforts against bioterrorism and other emergency health threats.

The Centers for Medicare & Medicaid Services was created, replacing the Health Care Financing Administration.

HHS responded to the nation's first bioterrorism attack - the delivery of anthrax through the mail.

The sequencing of the human genome was published.

The Ticket to Work and Work Incentives Improvement Act of 1999 was signed, making it possible for millions of Americans with disabilities to join the workforce without fear of losing their Medicaid and Medicare coverage. It also modernized the employment services system for people with disabilities.

Initiative to combat bioterrorism was launched.

The State Children's Health Insurance Program (SCHIP) was created, enabling states to extend health coverage to more uninsured children.

Welfare reform under the Personal Responsibility and Work Opportunity Reconciliation Act was enacted.

The Health Insurance Portability and Accountability Act (HIPAA) was enacted.

The Social Security Administration became an independent agency.

Vaccines for Children Program was established, providing free immunizations to all children in low-income families.

Human Genome Project was established.

Nutrition Labeling and Education Act was passed, authorizing the food label.

Ryan White Comprehensive AIDS Resources Emergency (CARE) Act began providing support for people with HIV/AIDS.

The Agency for Health Care Policy and Research (now the Agency for Healthcare Research and Quality) was created.

JOBS program and federal support for child care were created.

McKinney Act was passed to provide health care to the homeless.

National Organ Transplantation Act was signed into law.

Identification of AIDS - In 1984, HIV was identified by the Public Health Service and French scientists. In 1985, a blood test to detect HIV was licensed.

Federal funding was provided to states for foster care and adoption assistance.

The Department of Education Organization Act was signed into law, providing for a separate Department of Education. The Department of Health, Education, and Welfare (HEW) became the Department of Health and Human Services (HHS) on May 4, 1980.

The Health Care Financing Administration was created to manage Medicare and Medicaid separately from the Social Security Administration.

Worldwide eradication of smallpox, led by the U.S. Public Health Service.

Child Support Enforcement and Paternity Establishment Program was established.

National Cancer Act was signed into law.

National Health Service Corps was created.

International Smallpox Eradication program was established.

Community Health Center and Migrant Health Center programs were launched.

Medicare and Medicaid programs were created, making comprehensive health care available to millions of Americans.

Older Americans Act created the nutritional and social programs administered by HHS' Administration on Aging.

Head Start program was created.

Release of the first Surgeon General's Report on Smoking and Health.

Migrant Health Act was passed, providing support for clinics serving agricultural workers.

First White House Conference on Aging.

Licensing of the Salk polio vaccine.

Indian Health Service was transferred to HHS from the Department of the Interior.

The Cabinet-level Department of Health, Education, and Welfare (HEW) was created under President Eisenhower, officially coming into existence April 11, 1953. In 1979, the Department of Education Organization Act was signed into law, providing for a separate Department of Education. HEW became the Department of Health and Human Services, officially arriving on May 4, 1980.

Communicable Disease Center was established, forerunner of the Centers for Disease Control and Prevention.

The Federal Security Agency was created, bringing together related federal activities in the fields of health, education, and social insurance.

Federal Food, Drug, and Cosmetic Act was passed.

Social Security Act was passed.

The National Institute (later Institutes) of Health was created out of the Public Health Service's Hygienic Laboratory.

The Bureau of Indian Affairs Health Division, forerunner to the Indian Health Service, was created.

President Theodore Roosevelt's first White House Conference urged creation of the Children's Bureau to combat exploitation of children.

The Pure Food and Drugs Act was passed, authorizing the government to monitor the purity of foods and the safety of medicines, now a responsibility of the FDA.

Conversion of the Marine Hospital Service into the Public Health and Marine Hospital Service in recognition of its expanding activities in the field of public health. In 1912, the name was shortened to the Public Health Service.

Immigration legislation was passed, assigning the Marine Hospital Service the responsibility for medical examination of arriving immigrants.

The federal government opened a one-room laboratory on Staten Island for research on disease, a very early precursor to the National Institutes of Health.

The National Quarantine Act was passed, beginning the transfer of quarantine functions from the states to the federal Marine Hospital Service.

Appointment of the first Supervising Surgeon (later called the Surgeon General) for the Marine Hospital Service, which had been organized the prior year.

President Lincoln appointed a chemist, Charles M. Wetherill, to serve in the new Department of Agriculture. This was the beginning of the Bureau of Chemistry, forerunner to the Food and Drug Administration.


Our History - Our Story

Laboratory at 291 Peachtree Street, Atlanta, Georgia, 1945.
Aimee Wilcox & Laboratory Director, Dr. Seward Miller.

On July 1, 1946, the Communicable Disease Center (CDC) opened its doors and occupied one floor of a small building in Atlanta. Its primary mission was simple yet highly challenging: prevent malaria from spreading across the nation. Armed with a budget of only $10 million and fewer than 400 employees, the agency's early challenges included obtaining enough trucks, sprayers, and shovels necessary to wage war on mosquitoes.

As the organization took root deep in the South, once known as the heart of the malaria zone, CDC Founder Dr. Joseph Mountin continued to advocate for public health issues and to push for CDC to extend its responsibilities to other communicable diseases. He was a visionary public health leader with high hopes for this small and, at that time, relatively insignificant branch of the Public Health Service. In 1947, CDC made a token payment of $10 to Emory University for 15 acres of land on Clifton Road in Atlanta that now serves as CDC headquarters. The new institution expanded its focus to include all communicable diseases and to provide practical help to state health departments when requested.

Although medical epidemiologists were scarce in those early years, disease surveillance became the cornerstone of CDC's mission of service to the states and over time changed the practice of public health. There have been many significant accomplishments since CDC's humble beginnings. The following highlights some of CDC's important achievements for improving public health worldwide.

Today, CDC is one of the major operating components of the Department of Health and Human Services and is recognized as the nation's premier health promotion, prevention, and preparedness agency.

A look at CDC's significant contributions to public health, from 1946 to now.


1869 - Friedrich Miescher identifies "nuclein"

In 1869, Swiss physiological chemist Friedrich Miescher first identified what he called "nuclein" in the nuclei of human white blood cells, which we know today as deoxyribonucleic acid (DNA).

Miescher's original plan had been to isolate and characterise the protein components of white blood cells. To do this, he had made arrangements for a local surgical clinic to send him pus-saturated bandages, which he planned to wash out before filtering the white blood cells and extracting their various proteins.

However, during the process, he came across a substance that had unusual chemical properties unlike the proteins he was searching for, with a very high phosphorus content and a resistance to protein digestion.

Miescher quickly realised that he had discovered a new substance and sensed the importance of his findings. Despite this, it took more than 50 years for the wider scientific community to appreciate his work.


The 'Lucy' fossil rewrote the story of humanity

Forty years ago, on a Sunday morning in late November 1974, a team of scientists were digging in an isolated spot in the Afar region of Ethiopia.

Surveying the area, paleoanthropologist Donald Johanson spotted a small part of an elbow bone. He immediately recognised it as coming from a human ancestor. And there was plenty more. "As I looked up the slopes to my left I saw bits of the skull, a chunk of jaw, a couple of vertebrae," says Johanson.

It was immediately obvious that the skeleton was a momentous find, because the sediments at the site were known to be 3.2 million years old. "I realised this was part of a skeleton that was older than three million years," says Johanson. It was the most ancient early human – or hominin – ever found. Later it became apparent that it was also the most complete: fully 40% of the skeleton had been preserved.

At the group's campsite that night, Johanson played a Beatles cassette that he had brought with him, and the song "Lucy in the Sky with Diamonds" came on. By this time Johanson thought the skeleton was female, because it was small. So someone said to him: "why don't you call it Lucy?" The name stuck immediately. "All of a sudden," says Johanson, "she became a person."

It would be another four years before Lucy was officially described. She belonged to a new species called Australopithecus afarensis, and it was clear that she was one of the most important fossils ever discovered.

But at the campsite the morning after the discovery, the discussion was dominated by questions. How old was Lucy when she died? Did she have children? What was she like? And might she be our direct ancestor, a missing link in the human family tree? Forty years later, we are starting to have answers to some of these questions.

Though she was a new species, Lucy was not the first Australopithecus found. That was the Taung Child, the fossilised skull of a young child who lived about 2.8 million years ago in Taung, South Africa. The Taung Child was discovered in 1924 and was studied by anatomist Raymond Dart. He realised that it belonged to a new species, which he called Australopithecus africanus.

Dart wrote: "I knew at a glance that what lay in my hands was no ordinary anthropoidal brain. Here in lime-consolidated sand was the replica of a brain three times as large as that of a baboon and considerably bigger than that of an adult chimpanzee…" The Taung Child's teeth were more like a human child's than an ape's. Dart also concluded that it could walk upright, like humans, because the part of the skull where the spinal cord meets the brain was human-like.

The Taung Child was the first hint that humans originated in Africa. But when Dart published his analysis the following year, he came in for stiff criticism. At the time, Europe and Asia were thought to be the crucial hub for human evolution, and scientists did not accept that Africa was an important site. The Taung Child was denounced by the prominent anatomist Sir Arthur Keith as just an ape and of no major importance.

Over the next 25 years, more evidence emerged and showed that Dart had been right all along. By the time Lucy came along, anthropologists accepted that australopithecines were early humans, not just apes. So upon her discovery, Lucy became the oldest potential ancestor for every known hominin species. The immediate question was: what was she like?

Lucy had an "incredible amalgam of more primitive and more derived features that had not been seen before," says Johanson. Her skull, jaws and teeth were more ape-like than those of other Australopithecus. Her braincase was also very small, no bigger than that of a chimp. She had a hefty jaw, a low forehead and long dangly arms.

For Johanson, in the field at Hadar, it was immediately apparent that Lucy walked upright, like the Taung Child. That's because the shape and positioning of her pelvis reflected a fully upright gait. Lucy's knee and ankle were also preserved and seem to reflect bipedal walking. Later studies of A. afarensis feet offer even more evidence.

As an upright walker, Lucy strengthened the idea that walking was one of the key selective pressures driving human evolution forwards. The first hominins did not need bigger brains to take defining steps away from apes. Extra brainpower only came over a million years later with the arrival of Homo erectus. Though big brains would clearly be important later, walking remains one of the traits that makes us uniquely human.

"There's no other mammal that walks the way we do," says William Harcourt-Smith of the American Museum of Natural History in New York. "Without bipedalism one starts to wonder what would have happened to our lineage. Would we have happened at all?"

She may have walked like a human, but Lucy spent at least some of her time up in the trees, as chimpanzees and orang-utans still do today. It may be that upright walking evolved in the trees, as a way to walk along branches that would otherwise be too flexible.

It's not clear why Lucy left the safety of the trees and took to the ground. It is thought that savannahs were gradually opening up, so trees were spaced further apart. But the real reason for heading to the ground may have been to search for food, says Chris Stringer of the Natural History Museum in London, UK. In line with this idea, recent evidence suggests that australopithecines' diet was changing.

Studies of the remains of food trapped on preserved hominin teeth show that several species, including Lucy's, were expanding their diet around 3.5 million years ago. Instead of mostly eating fruit from trees, they began to include grasses and sedges, and possibly meat. This change in diet may have allowed them to range more widely, and to travel around more efficiently in a changing environment.

Lucy herself may have been collecting eggs from a lake. Fossilised crocodile and turtle eggs were found near her skeleton, leading to suggestions that she died while foraging for them.

An ape with butchering skills

How did australopithecines process all these new foods? Later species like Homo erectus are known to have used simple stone tools, but no tools have ever been found from this far back. However, in 2010 archaeologists uncovered animal bones with markings that seem to have been made by stone tools. That suggests Lucy and her relatives used stone tools to eat meat.

There have since been heated debates over whether or not the marks were really made by tools. But if they were, it is not really surprising, says Fred Spoor of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Spoor points out that modern chimpanzees use several tools, for instance to crack nuts. So if chimps can do it, Spoor says we might expect that A. afarensis – which was basically a "bipedal chimpanzee" – could too. Chimpanzees learn about tool use from their mothers, and Lucy could have picked it up in a similar way.

It would be more impressive if Lucy's species had also manufactured tools, but there is no evidence of that. "Cut marks don't imply a stone has been beautifully modelled into a knife," says Spoor. "It could be a sharp stone that has scraped muscle and fat from a bone."

As well as learning skills from her mother, Lucy may well have learned from other A. afarensis. Later fossil finds from the Hadar area, and comparisons with other primates, suggest that Lucy lived in a small social group. Chimpanzees also live in groups of a few dozen individuals, and A. afarensis may have stuck with this system.

Lucy was small compared to males of her species. That has led some researchers to suggest that her society was male-dominated. It may even have been polygamous, like gorilla groups today. In general, males are only significantly larger than females in species where one male can control several females. So Lucy may have lived in a group controlled by one dominant male, who had "a harem, or group of females around it," says Spoor.

It also seems that Lucy's childhood was much shorter than ours, and that she had to fend for herself from a young age.

We know that Lucy was a fully-grown adult, because she had wisdom teeth and her bones had fused. But unlike modern humans, she seems to have grown to full size very quickly, and was only about 12 years old when she died. In line with that, a 2006 study of a 3-year-old A. afarensis suggested that their brains reached their full size much earlier than ours do.

All in all, Lucy looks like a halfway house between apes and humans. She was ape-like in appearance and brain size, but she could walk upright like more advanced hominins that lived later. So where exactly does she fit into our family tree?

When she was discovered, Lucy was hailed as the oldest direct ancestor of modern humans. "A. afarensis took us one small step closer to that common ancestor we share with chimpanzees," says Tim White of the University of California, Berkeley. "We knew we were genetically incredibly close to chimpanzees, with the last common ancestor we shared with them estimated to be around six million years ago. Lucy had closed a gap in our knowledge."

It now looks like Lucy did not take us as close to our common ancestor with chimps as everyone thought. The latest genetic studies suggest we actually split from chimpanzees much earlier, perhaps as much as 13 million years ago. If that is true, the 3-million-year-old Lucy arrived quite late in the story of human evolution. Older fossils, such as the 4.4-million-year-old Ardipithecus described by White and his colleagues, are closer to our ape ancestors.

But a bigger problem for the idea that A. afarensis were our direct ancestors is that our lineage has turned out to be very complicated. There were many species of early hominin, often living side by side and possibly even interbreeding. When Lucy was found, about seven early hominins were known. Now there are at least 20. We simply don't know which ones eventually led to Homo sapiens, and which were evolutionary dead ends.

It is not even clear where in Africa modern humans evolved. Lucy suggested that Ethiopia was a crucial site. But in 2008 another species of Australopithecus, A. sediba, was discovered in South Africa. It lived about 2 million years ago, around the time the Homo genus first emerged. The Taung Child also hailed from the same area, so the find suggested that South Africa could have been our species' birthplace.

Despite this, White says Lucy's species is still the best candidate for a direct ancestor, but that more fossil evidence from that time is needed. "I am confident that the fossils will be found in that interval, because I know that in Ethiopia there are already four study areas with fossiliferous sediments of that age," he says.

Other species like Kenyanthropus platyops, which lived 3.5 million years ago, could also be the ancestor, says Stringer. It could also be a fossil that we haven't found yet.

Spoor is even more cautious and says we may never find our true ancestor, because we will only ever find a fraction of life that once existed. But Lucy certainly comes "pretty close", he says.

Lucy's discovery marked a turning point in our understanding of human evolution. Even today scientists are still learning from her. Paleoanthropologists can visit her in Ethiopia's National Museum in Addis Ababa, to run further analyses using new technologies. "She'll keep on giving," says Harcourt-Smith.

According to Johanson, perhaps her most important contribution was to "spark" a wave of research that has led to the discovery of many new species, like Ardipithecus and A. sediba. The number of known species has more than doubled since Lucy, but many parts of the story still need to be filled in, says Johanson. "I know there are several others [species] lurking on the horizon."

Thanks to all these discoveries, we now know that the evolutionary process that led to us was not linear. There was a lot of variation and experimentation along the way, with many species being driven to extinction – most famously the Neanderthals. Johanson says modern humans, for all our abilities, may have been fortunate to have survived it all.

Members of his team will soon be digging for fossils in the Afar region of Ethiopia, close to Lucy's home, as they do each year. It seems likely that this area has more fossils to offer. Even if it doesn't, many fossils that are more complete than Lucy, and much older, have been found since 1974. Nevertheless, Stringer says that "her place in human evolution is assured for the long term."

Donald Johanson spoke to Radio 4's BBC Inside Science.


While You Are Ringing In The Summer, Don't Forget To Remember The Importance Of What We Have The Day Off For.

Home of the free because of the brave.

"The American flag does not fly because the wind moves it. It flies from the last breath of each solider who died protecting it."

Today in America, we have over 1.4 million brave men and women on active duty in the armed forces to protect and serve our country.

There are also currently around 2.4 million retirees from the US military.

More than 3.4 million soldiers have died fighting in wars.

Every single year, everyone looks forward to Memorial Day weekend: beaches become overcrowded, people fire up their grills for a fun, sunny BBQ, and summer activities pick up, a "pre-game" before summer officially begins.

Many Americans have forgotten the true reason we have the privilege to celebrate Memorial Day.

In simple terms, Memorial Day is a day to pause, remember, reflect, and honor the fallen who died protecting and serving our country and everything we are free to do today.

Thank you for stepping forward, when most would have stepped backwards.

Thank you for the times you missed with your families, in order to protect mine.

Thank you for involving yourself, knowing that you had to rely on faith and the prayers of others for your own protection.

Thank you for being so selfless, and putting your life on the line to protect others, even though you didn't know them at all.

Thank you for toughing it out, and being a volunteer to represent us.

Thank you for your dedication and diligence.

Without you, we wouldn't have the freedom we are granted now.

I pray you never get handed that folded flag. The flag is folded thirteen times to represent the original thirteen colonies of the United States. Each fold carries its own meaning. According to tradition, some folds symbolize freedom and life, while others pay tribute to the mothers, fathers, and children of those who serve in the Armed Forces.

As long as you live, continuously pray for those families who get handed that flag as someone just lost a mother, husband, daughter, son, father, wife, or a friend. Every person means something to someone.

Most Americans have never fought in a war. They've never laced up their boots and gone into combat. They've never had to worry about surviving until the next day as gunfire went off around them. Most Americans don't know what that experience is like.

However, some Americans do know, because they fight for our country every day. We need to thank and remember these Americans, who fight for our country while the rest of us stay safe back home, away from the war zone.

Never take for granted that you are here because someone fought for you to be here and never forget the people who died because they gave that right to you.

So, as you are out celebrating this weekend, drink to those who aren't with us today and don't forget the true reason we celebrate Memorial Day every year.

"…And if words cannot repay the debt we owe these men, surely with our actions we must strive to keep faith with them and with the vision that led them to battle and to final sacrifice."


Early Stone Age Tools

The earliest stone toolmaking developed by at least 2.6 million years ago. The Early Stone Age includes the most basic stone toolkits made by early humans. The Early Stone Age in Africa is equivalent to what is called the Lower Paleolithic in Europe and Asia.

The oldest stone tools, known as the Oldowan toolkit, consist of at least:
• Hammerstones that show battering on their surfaces
• Stone cores that show a series of flake scars along one or more edges
• Sharp stone flakes that were struck from the cores and offer useful cutting edges, along with lots of debris from the process of percussion flaking

By about 1.76 million years ago, early humans began to strike really large flakes and then continue to shape them by striking smaller flakes from around the edges. The resulting implements included a new kind of tool called a handaxe. These tools and other kinds of ‘large cutting tools’ characterize the Acheulean toolkit.

The basic toolkit, including a variety of novel forms of stone core, continued to be made. It and the Acheulean toolkit were made for an immense period of time – ending in different places by around 400,000 to 250,000 years ago.


5 Things That Educators Should Know About the Philosophy of Education

The word philosophy is derived from two Greek words. The first word, philo, means “love.” The second, sophy, means “wisdom.” Literally, then, philosophy means “love of wisdom” (Power, 1982). Each individual has an attitude toward life, children, politics, learning, and previous personal experiences that informs and shapes their set of beliefs. Although you may not be conscious of it, this set of beliefs, or personal philosophy, informs how you live, work, and interact with others. What you believe is directly reflected in both your teaching and learning processes. This article explores how various philosophical views influence the teaching profession.

It is important to understand how philosophy and education are interrelated. In order to become the most effective teacher you can be, you must understand your own beliefs, while at the same time empathizing with others. In this article we will examine the study of philosophy, the major branches of philosophy, and the major philosophical schools of thought in education. You will have a chance to examine how these schools of thought can help you define your personal educational philosophy. Developing your own educational philosophy is a key part of your journey to becoming a teacher. In this article, we will discuss the 5 things that educators should know about the philosophy of education.

What are the major branches of philosophy? The four main branches of philosophy are metaphysics, epistemology, axiology, and logic. Metaphysics considers questions about the physical universe and the nature of ultimate reality. Epistemology examines how people come to learn what they know. Axiology is the study of fundamental principles or values. Logic pursues the organization of the reasoning process. Logic can be divided into two main components: deductive reasoning, which takes general principles and relates them to a specific case, and inductive reasoning, which builds up an argument based on specific examples.

What are the major schools of thought in philosophy? Idealism can be divided into three categories: classical, religious, and modern. Classical idealism, the philosophy of the Greeks Socrates and Plato, searches for an absolute truth. Religious idealism tries to reconcile God and humanity. Modern idealism, stemming from the ideas of Descartes, links perception and existence.

Realism, the school of thought founded by Aristotle, holds that the world of matter is separate from human perceptions. Modern realist thought has led to the “blank slate” notion of human capabilities. Pragmatism holds that we should select the ideas, actions, and consequences with the most desirable outcome, as well as learn from previous experiences to achieve desirable consequences. John Dewey’s Experimentalism brought the scientific method of inductive reasoning to the educational sphere.

Postmodernism and existentialism focus on intricate readings of texts and social and political conventions, examining existing structures for flaws. Essentially, they focus heavily on the present, and on understanding life as we know it. Jacques Derrida’s deconstruction method of reading texts suggests that universal rationality is not found in objective reality, but in the text. Michel Foucault, another postmodern philosopher, examined the relationship between truth and power.

What are the major philosophies of education? The major philosophies of education can be broken down into three main types: teacher-centered philosophies, student-centered philosophies, and society-centered philosophies. These include Essentialism, Perennialism, Progressivism, Social Reconstructionism, Existentialism, Behaviorism, Constructivism, Conservatism, and Humanism.

Essentialism and Perennialism are the two types of teacher-centered philosophies of education. Essentialism is currently the leading style of public education in the United States. It is the teaching of basic skills that have been proven over time to be needed in society. Perennialism focuses on the teaching of great works.

There are three types of student-centered philosophies of education. Progressivism focuses on developing the student’s moral compass. Humanism is about fostering each student to his or her fullest potential. Constructivism focuses on using education to shape a student’s world view.

There are two types of society-centered philosophies of education. Social Reconstructionism is the perspective that education is the means to solve social problems. Behaviorism focuses on cultivating behaviors that are beneficial to society.

What additional ideologies of educational philosophy exist? Other notable ideologies of educational philosophy include Nationalism, American Exceptionalism, Ethno-nationalism, Liberalism, Conservatism, and Marxism. Nationalism is a national spirit, or love of country, that ties the interests of a nation to the symbols that represent it. American Exceptionalism is a form of Nationalism that implies that the United States is a special country that is privileged to have a manifest destiny. Ethno-nationalism is similar to nationalism, but rather than the loyalty lying with one’s nation, it lies with one’s ethnic or racial group. Liberalism is the ideology that people should enjoy the greatest possible individual freedoms and that these freedoms should be guaranteed by due process of law. The opposite of liberalism is conservatism. Conservatism is the belief that institutions should function according to their intended original purpose and that any concepts that have not been maintained should be restored. Finally, Marxism is an ideological and political movement that focuses on the class system as a form of conflict within the social, political, and educational realms.

How is an educator’s educational philosophy determined? It is important to identify your own philosophy of education in order to understand your own system of values and beliefs so that you are easily able to describe your teaching style to potential employers.

While writing your own personal philosophy of education statement, it is vital to address several key components: How do I think? What is the purpose of education? What is the role of the teacher? How should the teacher teach? What is the role of the student? What should be taught? Additionally, make sure to be yourself and to write clearly and concisely. Do some research about the school you are applying to and address its mission and goals in your statement. Remember that education is about the students, and also remember to focus on your discipline. Think of the great teachers you have had in your life. Remember to get feedback. Additionally, don't make it long and don't ramble. Don't rehash your resume, be a know-it-all, or use strong statements.


GEOLOGICAL TIME, THE EVERYDAY, AND THE QUESTION OF THE POLITICAL

The Anthropocene, as Nigel Clark puts it bluntly, “confronts the political with forces and events that have the capacity to undo the political.” He invites humanists to “embrace the fully inhuman” in their thoughts, putting them “in sustained contact with times and spaces that radically exceed any conceivable human presence.”[97]

[97] Nigel Clark, “Geo-politics and the Disaster of the Anthropocene,” Sociological Review 62 (S1) (2014), 27–28. See also Nigel Clark, “Politics of Strata,” Theory, Culture and Society 34, no. 2–3 (2017), 1–21, special issue on Geosocial Formations and the Anthropocene.
The Anthropocene, in one telling, is a story about humans. But it is also, in another telling, a story of which humans are only parts, even small parts, and not always in charge. How to inhabit this second Anthropocene so as to bring the geological into human modes of dwelling is a question that remains. It could indeed take “decades, even centuries,” Jasanoff warns, “to accommodate to … a revolutionary reframing of human–nature relationships.”[98]

[98] Jasanoff, “A New Climate,” 237.

As I have tried to demonstrate, one obstacle to contemplating such accommodation—and the related question of human vulnerability—is the attachment in much contemporary thought to a very particular construction of “the political” while the task may be, precisely, to reconfigure it. This attachment functions as a fearful and anxious injunction against thinking the geobiological, lest we end up “anesthetizing” or “paralyzing” the political itself.[99]

[99] For a beginning, see Nigel Clark and Yasmin Gunaratnam, “Earthing the Anthropos? From ‘Socializing the Anthropocene’ to Geologizing the Social,” European Journal of Social Theory 20, no. 1 (2017), 111–131, and Bronislaw Szerszynski, “The Anthropocene Monument: On Relating Geological and Human Time,” European Journal of Social Theory 20, no. 1 (2017), 146–163.
Humans cannot afford to give up on the political (and on our demands for justice between the more powerful and the less), but we need to resituate it within the awareness of a predicament that now marks the human condition. Political thought has so far been human-centric, holding constant the “world” outside of human concerns or treating its eruptions into the time of human history as intrusions from an “outside.” This “outside” no longer exists. What is “just” for humans over one period of time may imperil our existence over another. Besides, Earth system science has revealed how critically entangled human lives are with the geo-bio-chemical processes of the planet. Our concerns for justice cannot any longer be about humans alone, but we don't yet know how to extend these concerns to the universe of nonhumans (that is, not just a few species). There is also the task of having to bring within the grasp of the affective structures of human-historical time the vast scales of the times of geobiology that these structures do not usually engage. Our evolution did not prepare us for these tasks either, as the biologist David Reznick explains:

One useful perspective for envisioning what “sudden” means in geology is to think about how the world is changing today. We are in the midst of the sixth mass extinction. One hundred million years from now, the fossil record of our time will reveal dramatic evidence of the dispersal of humans … around 100,000 years ago, … the spread of agriculture beginning around 10,000 years ago, the advent of the industrial revolution, then the super-exponential growth of the human population. The current extinction event began during the Pleistocene with the beginning of the decline of the mammalian megafauna…. Then there was a global decline of forests, expansion of deserts and grasslands, accumulation of industrial wastes, and an accelerating rate of extinction…. The reason why we do not sense cataclysm, even though the geological record is certain to preserve it this way, is because of the difference in the time frame of our lives versus the time frame of the geological record. To us, 100 years is a long time. In the fossil record, 100,000 or even a million years can appear as an instant.[100]

[100] Reznick, The Origin, 311.

One can see the attractions today of folding the narrative of climate change into the familiar structures of intra-human concerns of the political that have been part of modernity since the seventeenth century and that were extended and deepened in the era that saw great waves of decolonization, civil liberties movements, feminist movements, agitations for human rights, and globalization. But all that was before the news of anthropogenic climate change broke in on the world of humanists. Anthropocene time puts pressure on another question: What does it mean to dwell, to be political, to pursue justice when we live out the everyday with the awareness that what seems “slow” in human and world-historical terms may indeed be “instantaneous” on the scale of Earth history, that living in the Anthropocene means inhabiting these two presents at the same time? I cannot fully or even satisfactorily answer the question yet, but surely we cannot even begin to answer it if “the political” keeps acting as an anxious prohibition on thinking of that which leaves us feeling “out-scaled.”[101]

[101] Thanks to Timothy Morton for discussions we had on this point when I visited Rice University a few years ago.

Our sense of the planet has been profoundly based on what Edmund Husserl once famously called the “ontic certainty” of the world that human beings enjoyed. “The world is pregiven to us,” he wrote, “the waking, always somehow practically interested subjects … To live is always to live-in-certainty-of-the-world. Waking life is being awake to the world, being constantly and directly “conscious” of the world and of oneself as a living in the world, actually experiencing [erleben] and actually effecting the ontic certainty of the world.”[102]

[102] Edmund Husserl, The Crisis of European Sciences and Transcendental Phenomenology: An Introduction to Phenomenological Philosophy, transl. David Carr (Evanston, IL: Northwestern University Press, 1970), 142–143.
He would repeat the point in his short essay on “The Origin of Geometry,” the famous 1936 text that was included as an appendix to his Vienna lectures of 1934.[103] The earth that corresponds to our everyday world-horizon cannot be an object of any objective science.

[103] Edmund Husserl, “The Origin of Geometry,” appended to ibid., 358.

Jacques Derrida quotes from a Husserl “fragment” entitled (in English translation) “Fundamental Investigations on the Phenomenological Origin of the Spatiality of Nature” in which Husserl makes a distinction between the Copernican view of the world (embodying some of the “planet-centered” view that Zalasiewicz mentioned), in which “we Copernicans, … men of modern time, … say the earth is not ‘the Whole of Nature,’ it is but one of the planets in the indefinite space of the world,” and our everyday relationship to the Earth. “The earth as a spherical body … certainly is not perceptible as a whole, by a single person and all at once,” he remarks. It is perceptible only “in a primordial synthesis as the unity of singular experiences bound to each other,” though “it may be the experiential ground for all bodies in the experiential genesis of our world-objectification.” This Earth, Husserl asserts, cannot move: “It is on the Earth, toward the Earth, starting from it, but still on it that motion occurs. The Earth itself, in conformity to the original idea of it, does not move, nor is it at rest; it is in relation to the Earth that motions and rest first have sense.” The unity of this primordial Earth arises out of the unity of all humanity. Even if we looked at the Earth from another planet, then we would have “two pieces of a single Earth with one humanity,” for, as Derrida remarks, “the unity of all humanity determines the unity of the ground [the Earth] as such.”[104]

[104] Jacques Derrida, Edmund Husserl's Origin of Geometry: An Introduction, transl. and preface by John Leavey Jr., ed. David B. Allen (New York: Nicolas Hay, 1979), 83–84.

Climate change challenges this ontic certainty of the Earth that humans have enjoyed through the Holocene epoch and perhaps for longer. Our everyday thoughts have begun to be oriented—thanks again to the current dissemination of geological terms such as the Anthropocene in public culture—by the geological fact that the Earth that Husserl took for granted as the stable and unshakable ground from which all human thoughts (even Copernican ones) arose actually has always been a fitful and restless entity in its long journey through the depths of geological time.[105]

[105] Jan Zalasiewicz and Mark Williams, The Goldilocks Planet: The Four Billion Year Story of the Earth's Climate (Oxford: Oxford University Press, 2012).
It is not that we have not known of catastrophes in the geological history of the planet. We have, but the knowledge did not affect our quotidian sense of an innate assurance that the Earth provides a stable ground on which we project our political purposes. The Anthropocene disturbs that certainty by bringing the geological into the everyday. Nigel Clark makes this observation one of the starting points for his fascinating book, Inhuman Nature, by noticing how scientific facts can never entirely displace the “visceral trust in earth, sky, life, and water” that humans come to possess, and yet seeing how all four of Clark's terms are under question today: we do not know if the Earth (or Earth system) will honor our trust as we warm her up by emitting greenhouse gases into the sky, if fresh water will run short, and if life, as some predict, will be threatened with a sixth great extinction.[106]

[106] Nigel Clark, Inhuman Nature: Sociable Life on a Dynamic Planet (London: Sage, 2011), 5. I am indebted to Clark for drawing our attention to the Husserl text I discuss here.

Wittgenstein once said: “We see men building and demolishing houses, and are led to ask: ‘How long has this house been here?’ But how does one come on the idea of asking [that] about a mountain, for example?”[107]

[107] Ludwig Wittgenstein, On Certainty, ed. G. E. M. Anscombe and G. H. von Wright, transl. Denis Paul and G. E. M. Anscombe [1969] (New York: Harper, 1972), 13e.
Perhaps I can provide a historian's answer to Wittgenstein's question. A time has come when the geological and planetary press in on our everyday consciousness, as when we speak of there being “excess” carbon dioxide in the atmosphere—“excess” only on the scale of human concerns—or of renewable and nonrenewable sources of energy (nonrenewable on human time scales). For humanists living in such times and contemplating the Anthropocene, questions about histories of volcanoes, mountains, oceans, and plate tectonics—the history of the planet, in short—have become as routine in the life of critical thought as questions about global capital and the necessary inequities of the world that it made.


Rashawn Ray

David M. Rubenstein Fellow - Governance Studies

The context behind the march is significant. The 600-person civil rights march was actually about police brutality. Jimmie Lee Jackson, a 26-year-old church deacon, had been killed by James Bonard Fowler, a state trooper in Alabama. This march also occurred a year and a half after the infamous March on Washington, highlighting that little had changed in the lives of Black people in America. Bloody Sunday was highlighted in Ava DuVernay’s Best Picture-nominated film Selma. Musicians John Legend and Common won an Oscar for the film’s song “Glory.”

Bloody Sunday is often noted as a pinnacle of Lewis’ life. This defining moment encapsulates five things he taught us about getting in good trouble.

Vote, always

“Your vote matters. If it didn’t, why would some people keep trying to take it away? #goodtrouble” Lewis sent this tweet on July 3, 2018. It highlights his life’s work—equitable voting. One major part of the Civil Rights Movement was Black people gaining the right to vote. This finally occurred with the Voting Rights Act of 1965. But the Shelby County v. Holder Supreme Court decision in 2013 essentially gutted the Voting Rights Act and paved the way for widespread voter suppression and gerrymandering.

This is why it is imperative for Congress to act swiftly to pass the John Lewis Voting Rights Advancement Act to ensure equitable access to the polls. Lewis was an original Freedom Rider, participated in many sit-ins, and was arrested dozens of times for people to have the right to vote. “Some of us gave a little blood for the right to participate in the democratic process,” said Lewis. Now, Congress must honor Lewis’ legacy and ensure an equitable participation in the democratic process. As Lewis noted, “The vote is precious. It is almost sacred. It is the most powerful non-violent tool we have in a democracy.”

Never too young to make a difference

As a founder and leader of the Student Non-violent Coordinating Committee (SNCC), Lewis was the youngest person to speak at the March on Washington. Elder civil rights leaders aimed to taper his words. Lewis was critical of the Kennedy administration and the slowness with which broad-scale legislative change was occurring at the federal level. Lewis also critiqued civil rights legislation for not addressing police brutality against Black people. Imagine how this moment in the Movement for Black Lives might be different had elder Civil Rights leaders listened to Lewis. Lewis’ youth gave him a vision for a more transformative society that was mostly socialized out and, in some cases, beaten out of older leaders. Lewis teaches us that age is nothing but a number and that young people have to be the change they want to see by pushing and forcing older people toward equitable change. Older people are often socialized in the current arrangement of society and cannot fully envision a radically different world. Lewis stated, “I want to see young people in America feel the spirit of the 1960s and find a way to get in the way. To find a way to get in trouble. Good trouble, necessary trouble.” Young people can and should push for transformative change and hold us accountable to it.

Speak truth to power

“Speak up, speak out, get in the way,” said Lewis. He taught us the importance of speaking up and speaking out. We have to be willing to speak up about injustice, always, no matter the costs. My grandfather, who served in two wars and earned a Purple Heart and Bronze Star, taught me from birth that my silence is my acceptance. Lewis stated, “When you see something that is not right, not fair, not just, you have to speak up. You have to say something; you have to do something.” This motto should apply in all aspects of our lives. Lewis epitomizes it and encourages us not to be silent. He was adamant about supporting free speech, but he was also adamant about condemning hate speech. “I believe in freedom of speech, but I also believe that we have an obligation to condemn speech that is racist, bigoted, anti-Semitic, or hateful.”

Become a racial equity broker

Lewis is the personification of transitioning from a political activist to a politician. I frame it as transitioning from a racial equity advocate to a racial equity broker. A racial equity advocate speaks up and speaks out, stands in the gap, and sits at the table to advocate for people who cannot advocate for themselves. There is a saying: “If you are not at the table, you are on the menu and someone is eating you for lunch.” Shirley Chisholm said, “If they don’t give you a seat at the table, bring a folding chair.” Lewis realized that to make transformative change, he had to be at the table and often bring his own chair. Once at the table, he realized that he needed to help draft the documents that got discussed at the table. This led him to become an elected official and a racial equity broker who could alter, deconstruct, and restructure the laws, policies, procedures, and rules that inhibit racial equity.

Never give up

When Lewis was elected to Congress in 1986, one of his first bills called for the creation of a national museum to chronicle the history, culture, and successes of Black Americans. The bill finally passed in 2003, and the museum opened in 2016 as the National Museum of African American History and Culture. Lewis taught us persistence. He taught us that when a person has transformative ideas, they should not taper those ideas. Instead, they should push those ideas until others get on board. Simply because change is slow does not mean change agents have to move slowly towards it. Lewis was a lightning bolt for equity, social change, and social justice. We must continue his legacy, never forget history, pursue equity, and get in good trouble.



