Concept of Knowledge Revisited


Post by kmaherali »

THE BIG IDEAS: WHY DOES ART MATTER?

Five Theses on Creativity

It permeates life, and, like love, it can break your heart.


The word “art” can seem pretentious: When people hear it, they worry someone will force them to read a novel, or go to a museum, or see a movie without any explosions in it.

To me, art simply refers to those aspects of our lives that can be suffused and transformed by creativity. And having creativity in our lives is important. Without it we’re just going through the motions, stuck in the past. With it we feel alive, even joyous.

But if I say that art is simply life imbued with creativity, isn’t that just a case of obscurum per obscurius — of explaining the murky with the even murkier? After all, what exactly is creativity?

To help unravel this puzzle, here are five theses on creativity:

Thesis No. 1: Creativity makes something new. A different way of talking can suddenly make our world seem new. Here’s an example: In the Middle Ages, a road was something people walked on, the ocean a terrifying expanse of blue. But when the anonymous author of the Old English epic poem “Beowulf” called the ocean a “whale-road,” he made his readers experience the ocean afresh. The ocean may be an obstacle for us land-bound humans, but for whales it’s a road.

Thesis No. 2: Creativity hides itself. Creativity is shy. It’s easy to miss that creativity is about making something new, because, as soon as we succeed, the new thing we’ve created appears obvious, as if it had always been there. “Whale” and “road” were just there hanging around when someone said “whale-road.” And then people said, “Of course! The ocean may be a barrier for us, but it’s not for whales. They swim in it.” All that one person did was say what there was to be said — except it wasn’t there to be said, until he or she said it.

Creativity can seem like a tool for solving problems: We need a new word for the ocean! But creativity doesn’t just solve problems; it also makes or discovers new problems to solve. Hundreds of years ago, nobody knew the old words for ocean weren’t cutting it, until someone said “whale-road.” And everyone was like, “Wow! It is a whale-road!” Creativity always hides itself — it makes itself disappear.

That’s a helpful point to keep in mind when thinking about science, because creativity is fundamental there, too. We tend to think of science as a series of nonoptional statements about how the world works — as a collection of things we must believe. But if that’s true, how can scientists be creative? They can’t really say anything new; they just have to passively express things as they are.

But, of course, that isn’t how science works at all. We actually have to create it. When Newton came up with his second law of motion (force equals mass times acceleration), he was being just as creative as the person who came up with “whale-road.” And as with “whale-road,” Newton’s creativity was concealed by the success of his creative act: His formulation pointed toward something that already existed, but also didn’t. The more successful we are, the more it will seem like the things we created didn’t need to be created. Creativity hides.

Thesis No. 3: Creativity permeates life. Creativity fills our lives like ocean water fills the grains of a sand castle — saturating the spaces between this moment and the next, this action and the next, this word and the next. As a consequence, you can be creative when you’re doing pretty much anything: You can be creative in the way you walk to work, respond to grief, make a friend, move your body when you wake up in the morning, or hum a tune on a sunny day.

We are constantly remaking our lives through acts of creativity. In fact, creativity makes life possible — just like water makes a sand castle possible. Without water a sand castle falls apart, and a life that is completely routinized and uncreative is no life at all.

Thesis No. 4: Creativity can break your heart. It’s inherently risky. You might say, “Creativity seems so joyous and fun — why isn’t everybody creative all the time? Why do people steal and plagiarize instead? Why do they follow rules when they’re trying to be creative? Why do they always make the hero a handsome man, or always make song lyrics rhyme? Why do they copy what’s worked before?”

Because creativity can fail. If you knew ahead of time that the thing you were making would work, you wouldn’t be engaged in creativity. And when it doesn’t work, it breaks your heart. You look like a fool; what’s worse, you feel like a fool. It’s very embarrassing. But you can’t get the joy of creativity without risking pain and failure — which is also true of love.

Thesis No. 5: Creativity is a kind of love. That’s why it can break your heart, and why, at the same time, it can make the world come alive. When you’re creative, you make things fresh and new; when you love someone or something, you do the same.

That’s also why creativity is shy, why it hides. We don’t want the way we love to be captured by someone else’s loveless formulation. We don’t want someone to say, “Oh, he loves everybody with blond hair” or “He loves everybody who reminds him of his mother.” We don’t like it when people think they can manipulate us by figuring out whom or what we love — it’s an insult to those we love, to us, to love itself. So we’re a bit guarded when we talk about love; we don’t want people using the way we love to take advantage of us.

Corruptio optimi pessima — the corruption of the best is the worst: Love is the best part of our lives and can permeate our entire being, but it’s the most terrible thing when it’s misused or misunderstood. It’s the same with creativity.

https://www.nytimes.com/2020/05/29/opin ... ogin-email

Post by kmaherali »

Google translation of the original article in Portuguese:

https://the.ismaili/portugal/%C3%A1rvor ... nhecimento

The tree of knowledge

Many years ago, there lived a king who had many problems in his kingdom. One day he met an old man who spoke of a strange and wonderful tree. Whoever ate its fruit, said the man, would become wise. The king immediately set out in search of this wonderful tree.


The king traveled everywhere, through many months and seasons. Wherever he went, he found many strange trees and bit into their fruits, yet he remained the same person he had always been.

As the months passed, the king grew more and more tired. By now he had searched all over his kingdom and tasted all kinds of fruits, but he had not been able to find the magic tree. When he returned, he looked for the old man who had told him about the tree of knowledge and asked him whether it really existed. The old man replied that he could only show him that tree if the king became his student, adding that it would take many years and that he wanted to be sure the king was willing to wait that long.

The king agreed to become a student of the old man. Day after day, the old man taught the king new and wonderful things that he had never known before. The king slowly realized that the tree of knowledge was none other than his own teacher.

From this story, we understand the important role of teachers in our lives. Beyond the teachers who instruct us in our schools, universities, and Dar at-Ta'lim, our parents, families, community, Prophets, and Imams are all our teachers, giving us constant guidance at every stage of our lives. It is important that we understand their guidance and strive to follow it. The story also teaches us to be generous with our knowledge, both in sharing it and in using it for the greater good of humanity.

Source: The Institute of Ismaili Studies, Growing and Learning, London: Islamic Publications Limited.

Post by swamidada_2 »

Education: PTI’s plan exposed
Pervez Hoodbhoy | Updated 18 Jul, 2020

BE prepared, Pakistan! Imran Khan’s government is poised to inflict damage upon this country’s education system in a manner never seen before. Its so-called Single National Curriculum (SNC) hides systemic changes going far deeper than the ones conceived and executed by the extremist regime of Gen Ziaul Haq. Implementation is scheduled for 2021.

At first glance a uniform national curriculum is hugely attractive. Some see it striking a lethal blow at the abominable education apartheid that has wracked Pakistan from day one. Year by year, a widening gap has separated beneficiaries of elite private education from those crippled by bad public schooling. So what could be better than the rich child and the poor child studying the same subjects from the same books and being judged by the same standards?

But this morally attractive idea has been hijacked, corrupted, mutilated and beaten out of shape by those near-sighted persons now holding Pakistan’s future in their hands, and who, like their boss, kowtow to the madressah establishment. Prime Minister Khan was widely criticised in 2016-17 for making huge grants to madressahs of the late Maulana Samiul Haq, self-professed father of the Taliban who was murdered by an associate in mysterious circumstances.

The SNC massively prioritises ideology over education quality and acquisition of basic skills.

As yet only SNC plans for Class I-V are public. But the huge volume of religious material they contain exceeds that of every curriculum in Pakistan’s history. A column-by-column comparison with two major madressah systems — Tanzeemul Madaris and Rabtaul Madaris — reveals a shocking fact. Ordinary schools will henceforth impose more rote learning than even these madressahs. Because normal schoolteachers are under-equipped religiously, the SNC calls for summoning an army of madressah-educated holy men — hafizes and qaris — as paid teachers inside schools. How this will affect the general ambiance and the safety of students is an open question.

The push for a uniform national curriculum idea derives from three flawed assumptions:

First: It is false that quality differences between Pakistan’s various education streams stem from pursuing different curricula. When teaching any secular subject such as geography, social studies or science, all streams have to cover the same topics. While details and emphases obviously differ, each must deal with exactly seven continents and water being H2O.

Instead, learning differentials arise because students experience very different teaching methods and are evaluated using entirely different criteria. So, for example, a local examination board will typically ask a mathematics student to name the inventor of logarithms whereas an ‘O’-level student must actually use logarithms to solve some problem. The modern world expects students to reason their way through a question, not parrot facts.

Second: It is false that a hefty dose of piety will somehow equalise students of Aitchison College and your run-of-the-mill neighbourhood school. The legendary Mahmood and Ayyaz prayed in the same suff (prayer line) and established a commonality without ending their master-slave relationship. Similarly, rich and poor schools will remain worlds apart unless equalised through school infrastructure, well-trained teachers, high quality textbooks and internet access. How the needed resources will be generated is anybody’s guess. Under the PTI, defence is the only sector seeing increases instead of cuts.

Third: It is false that school systems belonging to the modern world can be brought onto the same page as madressahs. Modern education rests squarely upon critical thinking, and success/failure is determined in relation to problem solving and worldly knowledge. Madressah education goals are important but different. They seek a more religiously observant student and a better life after death. Understandably, critical thinking is unwelcome.

While some madressahs now teach secular subjects like English, science and computers, this comes after much arm-twisting. Soon after 9/11, madressahs were spotlighted as terrorist breeding grounds. Musharraf’s government, beholden as it was to America, ordered them to teach secular subjects. Most rejected this outright but others were successfully pressurised. However, madressahs teach secular and religious subjects identically; reasoning is sparse and authoritarianism dominates.

While the new Class I-V SNC document also discusses secular subjects, much of this is pointless tinkering with the minutiae of teaching English, general knowledge, general science, mathematics and social studies. These proposals are not accompanied by plausible plans for how the necessary intellectual or physical resources will be garnered and the plans implemented.

Still bigger changes are around the corner. The Punjab government has made teaching of the Holy Quran compulsory at the college and university level. Without passing the required examination no student will be able to get a BA, BSc, BE, ME, MA, MSc, MPhil, PhD or medical degree. Even the Zia regime did not have such blanket requirements. To get a university teaching job in the 1980s, you had to name all the wives of the Holy Prophet (PBUH) and recite some difficult religious passages such as Dua-i-Qunoot. Still, students could get degrees without that. That option is now closed.

Starkly inferior to their counterparts in Iran, India and Bangladesh, Pakistani students perform poorly in all international science and mathematics competitions. Better achievers are invariably from the elite ‘O’-/‘A’-level stream. More worrying is that most students are unable to express themselves coherently and grammatically in any language, whether Urdu or English. They have stopped reading books.

Significantly, as yet the PTI’s new education regime is mum on how it will advance its goal of closing a huge skill deficit. So poor is the present quality of technical and vocational institutes that private employers must totally retrain the graduates. That’s why private-sector industrial growth is small and entire state enterprises, such as PIA and Pakistan Steel Mills, have collapsed. Pakistan’s space programme flopped but Iran has just put a military satellite into orbit and India is well on the way to Mars.

Empowered by the 18th Amendment, Pakistan’s provinces should vigorously resist the regressive plan being thrust upon the nation by ideologues who have usurped power in Islamabad. Else Pakistan will end up as the laughing stock of South Asia, left behind even by Arab countries. Pakistan’s greatest need — and its single greatest failure — lies in imparting essential life skills to its citizens. To move ahead, the priority should be to educate rather than score political points.

The writer teaches physics in Lahore and Islamabad.

Published in Dawn, July 18th, 2020

https://www.dawn.com/news/1569679/educa ... an-exposed

Post by kmaherali »

Lifelong Learning

Lifelong Learning is the continuous acquisition of new knowledge in the pursuit of personal success, professional success, and service to others.

Why is Lifelong Learning important?

Lifelong, continual learning is vital to our existence, the undeniably critical fuel for our minds to grow. The continuous process of learning sharpens critical skills that help us serve, lead, and innovate. Today we are called to compete amid increasing globalization, rapidly evolving technologies, and a shifting employment landscape. In an ever-changing world, a single area of proficiency will not be enough; to thrive we will need to stimulate creativity and curiosity while we adapt.

But in an age of accelerating change, when even the most sophisticated skills are quickly outdated, we will find many allies in the developing world who are coming to understand that the most important skill anyone can learn is the ability to go on learning.

Mawlana Hazar Imam
Annual Meeting of the International Baccalaureate
Atlanta, GA, April 18, 2008

What does it really mean to be a Lifelong Learner?

Although it is true that knowledge is all around us and can be acquired anywhere, lifelong learning is a deliberate and voluntary act. Acquiring knowledge through various forms of education is no longer enough; we must also engage with that knowledge, feed our curious nature with new learning, and desire to evolve into the next version of ourselves. A lifelong learner seeks knowledge intentionally, with a growth mindset. When we continuously learn, we are motivated to build skills that can lead us to new opportunities in our professional, business, and personal paths—preparing us to excel in an ever-changing landscape.

How do I become a Lifelong Learner?

Through the launch of this portal, we want to focus on specific areas that are of highest importance in the current times:

Growing our business
Finding a new job/role
Enhancing leadership skills
Learning pathways and skills for the future (coming soon)

As we explore this portal and lifelong learning further, here are some tips to navigate us:

- Prepare for the unexpected by staying relevant: To adapt to unforeseen changes and operate effectively in a rapidly changing technological environment, we must continuously learn new skills and sharpen current skills.
- Position our careers/businesses for opportunities in a digital and technological economy.
- Build new competencies to boost your confidence and your profile:
- For those who are fortunate to be in a career they enjoy, lifelong learning—especially when done deliberately, acquiring relevant and meaningful knowledge—can make you indispensable to your employer, facilitate faster promotions and larger salary increases, or simply give you more transferable skills in anticipation of your next move.
- For those out of work, engaging in your own development has never been easier or more cost effective. Democratized access to knowledge puts brand new skills within reach that were previously inaccessible without physical presence or a significantly large investment of time and capital.
- Learn something new every day.
- Be on the lookout for resources to help support your learning and skill enhancement journey.
- Always have a curious mind. Explore different topics, be inquisitive, and don’t be afraid to ask for clarity from people who will help and support you in your journey.

https://the.ismaili/usa/lifelong-learning-0

Post by kmaherali »

Accelerated Learning: Learn Faster and Remember More

You can train your brain to retain knowledge and insight better by understanding how you learn. Once you understand the keys to learning, everything changes—from the way you ask questions to the way you consume information. People will think you have a superpower.
Learning is the act of incorporating new facts, concepts, and abilities into our brains. We start learning in the womb and we never stop; we are always developing new competencies. Every new bit of knowledge we acquire builds on what we already know and gives us a fuller, richer picture of the world. And the more developed our understanding of the world is, the easier it is for us to adapt and pivot when our circumstances change.

We know from biology that organisms that can adapt to their constantly changing environment survive and thrive. Those that can’t eventually go extinct. The same is true for us in our life and work. We all know the person at work who hasn’t adapted to the changing times. Their unwillingness to stretch themselves and learn something new makes it seem like they are moving backwards.

People can end up stuck with a static amount of knowledge because we don’t just passively absorb new ideas and information. Learning something new requires active engagement. At FS, we see learning as part of our daily job. We get better to help you get better, and we give you the tools to learn.

The greatest enemy of learning is what you think you know. When you think you know something, learning something new means you might have to change your mind, so it’s easy to think there’s no room for new ideas. But not wanting to change your mind will keep you stuck in the same place. Overcoming our egos can be one of the big challenges of learning. Therefore, being willing to admit when you’re wrong and adjust your thinking is the thing that will help you learn the most. The first step to learning is recognizing your ignorance and deciding to do something about it.

At FS, we believe that thriving in life means relentless lifelong learning. Things change quickly, and you can’t coast on what you already know. The best approach you can take towards learning is one that helps you go to bed a little bit smarter each day.

WHY WE’RE BAD AT LEARNING

One reason why we’re bad at learning is that we bring a lot of baggage to it—baggage we often pick up early in life and then struggle to let go of later. If you can let go of assumptions about what you should do, you can learn much more effectively. One major piece of baggage we accrue is the belief that if we’re not visibly active, we’re not learning. This is incorrect. Learning requires time to reflect. It requires discussing what you’ve learned and letting your mind wander. You need to let go of trying to look smart, and focus instead on trying to be smart.

We sometimes struggle with learning because we expect it to be easy. We’d all like to find a silver bullet: a quick, easy shortcut to picking up whatever we want and never forgetting it. The internet teems with blog posts promising you can learn a language in a week, learn to code in a month, or master the violin overnight. This is all bull. Learning isn’t just knowing something for a day. It’s deep wisdom that allows you to create, innovate, and push boundaries.

Another reason we’re bad at learning is that the modern world erodes our attention spans by training us to be in a constant state of distraction. It might feel good in the moment to jump to the latest ping on your phone, but learning requires deep focus. When you’re in a distracted state, new information can’t fix itself in your mind, and you end up with gaps in your understanding. Focusing is an art—through experimentation and creativity, you can build systems that let you give your full attention to whatever you’re learning.

When we talk about accelerated learning at FS, we don’t mean you can overcome the need for hard work. We mean it’s possible to find ways of learning that give you tangible results, instead of using ones that waste time or get you nowhere. But no matter what, you still need to do the work. Learning should be difficult, but that doesn’t mean it can’t be fun.

In fact, you enhance your skills the most when you stretch yourself to the limits of your abilities. Pushing yourself to the point that feels challenging yet doable is the foundation of deliberate practice, the technique elite people in every field use to grow their expertise.

Outside of your intellectual comfort zone is where you experience the greatest learning. For this reason, having a lot of helpful guidance early on can be counterproductive if it eliminates the all-important struggle to master new material.

You might fail if you move beyond what you find comfortable. You might apply new knowledge incorrectly or misunderstand the key points of an argument in a new field. This is painful. It also offers the opportunity to learn from your failures. We all hate looking dumb, but you can’t let failure dissuade you from trying to expand your circle of competence.

TWO MAIN SOURCES OF LEARNING

There are two main sources we can learn from: our own experience, and the experience of others as captured in history. Exploring the distinction between the two can show us the best ways to learn from each of them.


Learning from history

You can learn from the experiences of others by studying history and applying its lessons to the present. History tends to repeat itself, so the dilemmas and decisions you face today often have historical antecedents. Studying the past helps us know how to shape the future. History is one of our biggest sources of fundamental knowledge.

Any time you learn from the past, remember that knowledge collapses over time. Much of what you know today will one day prove false, just like much of what people knew a century, fifty years, or even a decade ago has since proved false. Facts have a half-life, one that can be especially short in fields like medicine and the social sciences.

It is a mistake to think we have reached the endpoint of human knowledge, or that anything you learn now will be true forever. When you learn from history, you draw from lessons shaped by the perspective of the person who captured what happened. Thus, historical knowledge is something to continuously update as you learn both from what happened and how you choose to look at it.

Learning from experience and reflection

In addition to reading, direct experience is the other main way we learn.

Double loop learning is a way of updating your opinions and ideas in response to new evidence and experience. When you keep repeating the same mistakes, you’re using single loop learning. It doesn’t get you far. When you reflect on experiences, collect new data, and make an active effort to seek out feedback, you’re using double loop learning.

Reflection allows you to distill experience into learning. Don’t just “do,” think about what you’re doing and what you’ve done. High performers make adjustments based on both their successes and failures.

LEARN BETTER

We recommend the following two proven techniques for improving your learning.

The Feynman Technique

If you want to supercharge your learning, the single most effective technique we’ve uncovered for absorbing new concepts comes from the famed Nobel Prize-winning physicist Richard Feynman. The Feynman Technique ensures you understand what you learn. It includes the following four steps:

- Choose a concept you wish to learn about.
- Pretend you are teaching it to a child—a sixth-grader, specifically. Write your explanation down or say it out loud.
- Identify any gaps in your understanding that might show up when you try to simplify the concept; go back to the source material to find the information you need.
- Review and simplify your explanation again.

It works because writing out a concept in language a child would understand forces you to understand it at a deeper level. Sometimes we use jargon and complicated language to hide what we don’t understand. The Feynman Technique lays bare the true extent of our knowledge.

Similarly, asking better questions is a route to faster learning. The most mundane questions—the ones a sixth-grader might ask—can sometimes teach us the most because they require an explanation that digs into the details.

How do you know if you’ve truly learned a new concept? Feynman proposed a simple alternate test: try to rephrase it in your own language without using its actual name. For instance, describe what enables a dog to run without using the word “energy.”

Spaced repetition

Rote memorization doesn’t work. Period. The key to effective learning is spaced repetition, a technique that works with the way your brain naturally retains information, not against it.

Spaced repetition involves revising information at increasing intervals. This reflects and combats the fact that once you learn something you gradually forget it, with the forgetting happening fast at first, then leveling off. Using spaced repetition, you remind yourself of information often at first, then less often.

Memory mastery comes from repeated exposure to new material. In order to learn something, you need to retrieve it from memory again and again. Retrieval makes information stick even better than re-exposing yourself to the original material.
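
To make the "increasing intervals" idea concrete, here is a minimal sketch in Python. The doubling rule, the function name, and the starting interval are our own illustrative assumptions, not a schedule prescribed by the article:

```python
# A toy spaced-repetition scheduler: each successful recall pushes the next
# review further into the future; a failed recall starts the spacing over.
from datetime import date, timedelta

def next_interval(last_interval_days: int, recalled: bool) -> int:
    """Return the number of days to wait before the next review."""
    if not recalled:
        return 1                            # forgot it: review again tomorrow
    return max(1, last_interval_days * 2)   # remembered it: double the gap

# Example run: reviews land roughly on days 1, 2, 4, 8, 16 from today.
interval = 1
review_day = date.today()
for _ in range(5):
    review_day += timedelta(days=interval)
    print(review_day, f"(current interval: {interval} days)")
    interval = next_interval(interval, recalled=True)
```

Flashcard systems tune the multiplier per item rather than always doubling, but the shape of the schedule is the same: short gaps at first, then progressively longer ones.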

ARTICLES ON ACCELERATED LEARNING

- The more we learn about the world, the more we can learn about ourselves, according to Nietzsche.
- “Knowledge makes everything simpler”: advice for learning from executive and technologist John Maeda, including why you should teach yourself the basics and why metaphors are powerful for transferring information across contexts.
- Charles Darwin may not have had an unusually high IQ, but he was able to outpace other thinkers by learning how to balance out his deficiencies.
- Ken Iverson, the former CEO of Nucor Steel, believed MBAs should focus on teaching students how to understand and lead people above all else.
- Harvard biologist/psychologist Steven Pinker’s career is a testament to the benefits of multidisciplinary thinking. Here’s what he believes students should learn as part of a thorough education.
- In a charming letter to his son Hans, Albert Einstein said the best way to learn is to enjoy something to the point where you don’t even notice the time passing.
- Even the most skilled teachers struggle to overcome the reality that we forget most information shortly after being exposed to it. Effective learning requires building your own understanding, with the guidance of an expert teacher.
- Chess and martial arts genius Josh Waitzkin teaches us that the art of learning requires first mastering the fundamentals by breaking a skill down into blocks.
- “Mozart’s Brain and the Fighter Pilot” shows us that we get smarter by exercising our cognitive powers in the same way that we get stronger by exercising our muscles.
- Never learning to paint via the conventional route helped Vincent van Gogh approach his work in a unique way, noticing details a trained artist might not have.

THE BEST BOOKS ON LEARNING

- The Art of Learning: An Inner Journey to Optimal Performance, Joshua Waitzkin
- The Laws of Simplicity, John Maeda
- Make It Stick: The Science of Successful Learning, Peter C. Brown, Henry L. Roediger III, and Mark A. McDaniel
- Mindshift: Break Through Obstacles to Learning and Discover Your Hidden Potential, Barbara Oakley
- Ultralearning: Master Hard Skills, Outsmart the Competition, and Accelerate Your Career, Scott H. Young
- Mastery, Robert Greene
- Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School, John Medina
- Deep Work: Rules for Focused Success in a Distracted World, Cal Newport
- Mindset: The New Psychology of Success, Carol S. Dweck
- The Talent Code, Daniel Coyle

CONCLUSION

Learning isn’t something you do at the behest of someone else. You’re responsible for it. According to the prolific author Louis L’Amour, all education is self-education. If you don’t take charge of your learning, no one else will. Maya Angelou and George Washington took the same view. It’s up to you to build the habit of lifelong learning.

Relevant links at:

https://fs.blog/learning/

Post by kmaherali »

The Observer Effect: Seeing Is Changing

The act of looking at something changes it – an effect that holds true for people, animals, even atoms. Here’s how the observer effect distorts our world and how we can get a more accurate picture.

***

We often forget to factor in the distortion of observation when we evaluate someone’s behavior. We see what they are doing as representative of their whole life. But the truth is, we all change how we act when we expect to be seen. Are you ever on your best behavior when you’re alone in your house? To get better at understanding other people, we need to consider the observer effect: observing things changes them, and some phenomena only exist when observed.

The observer effect is not universal. The moon continues to orbit whether we have a telescope pointed at it or not. But both things and people can change under observation. So, before you judge someone’s behavior, it’s worth asking if they are changing because you are looking at them, or if their behavior is natural. People are invariably affected by observation. Being watched makes us act differently.

“I believe in evidence. I believe in observation, measurement, and reasoning, confirmed by independent observers.”
— Isaac Asimov

The observer effect in science

The observer effect pops up in many scientific fields.

In physics, Erwin Schrödinger’s famous cat highlights the power of observation. In his best-known thought experiment, Schrödinger asked us to imagine a cat placed in a box with a radioactive atom that might or might not kill it within an hour. Until the box is opened, the cat exists in a state of superposition—a combination of both states at once—that is, the cat is both alive and dead. Only by observing it does the cat shift permanently to one of the two states. The observation removes the cat from superposition and commits it to just one.
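
For readers who like notation, the "both at once" idea can be written schematically as an equal-weight superposition (the symbols below are standard textbook shorthand, not something from the article itself); observation collapses the sum to a single term:

$$
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\big(|\text{alive}\rangle + |\text{dead}\rangle\big)
$$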

In biology, when researchers want to observe animals in their natural habitat, it is paramount that they find a way to do so without disturbing those animals. Otherwise, the behavior they see is unlikely to be natural, because most animals (including humans) change their behavior when they are being observed. For instance, Dr. Cristian Damsa and his colleagues concluded in their paper “Heisenberg in the ER” that being observed makes psychiatric patients a third less likely to require sedation. Doctors and nurses wash their hands more when they know their hygiene is being tracked. And other studies have shown that zoo animals only exhibit certain behaviors in the presence of visitors, such as being hypervigilant of their presence and repeatedly looking at them.

In general, we change our behavior when we expect to be seen. Philosopher Jeremy Bentham knew this when he designed the panopticon prison in the eighteenth century, building upon an idea by his brother Samuel. The prison was constructed so that its cells circled a central watchtower so inmates could never tell if they were being watched or not. Bentham expected this would lead to better behavior, without the need for many staff. It never caught on as an actual design for prisons, but the modern prevalence of CCTV is often compared to the Panopticon. We never know when we’re being watched, so we act as if it’s all the time.

The observer effect, however, is twofold. Observing changes what occurs, but observing also changes our perceptions of what occurs. Let’s take a look at that next.

“How much does one imagine, how much observe? One can no more separate those functions than divide light from air, or wetness from water.”
— Elspeth Huxley

Observer bias

The effects of observation get more complex when we consider how each of us filters what we see through our own biases, assumptions, preconceptions, and other distortions. There’s a reason, after all, why double-blinding (ensuring that neither tester nor subject receives information that may influence their behavior) is the gold standard in research involving living things. Observer bias occurs when we alter what we see, either by noticing only what we expect or by behaving in ways that influence what occurs. Without intending to do so, researchers may encourage certain results, leading to changes in ultimate outcomes.

A researcher falling prey to observer bias is more likely to make erroneous interpretations, leading to inaccurate results. For instance, in a trial for an anti-anxiety drug where researchers know which subjects receive a placebo and which receive actual drugs, they may report that the latter group seems calmer because that’s what they expect.

The truth is, we often see what we expect to see. Our biases lead us to factor in irrelevant information when evaluating the actions of others. We also bring our past into the present and let that color our perceptions as well—so, for example, if someone has really hurt you before, you are less likely to see anything good in what they do.

The actor-observer bias

Another factor in the observer effect, and one we all fall victim to, is our tendency to attribute the behavior of others to innate personality traits. Yet we tend to attribute our own behavior to external circumstances. This is known as the actor-observer bias.

For example, a student who gets a poor grade on a test claims they were tired that day or the wording on the test was unclear. Conversely, when that same student observes a peer who performed badly on a test on which they performed well, the student judges their peer as incompetent or ill-prepared. If someone is late to a meeting with a friend, they rush in apologizing for the bad traffic. But if the friend is late, they label them as inconsiderate. When we see a friend having an awesome time in a social media post, we assume their life is fun all of the time. When we post about ourselves having an awesome time, we see it as an anomaly in an otherwise non-awesome life.

We have different levels of knowledge about ourselves and others. Because observation focuses on what is displayed, not what preceded or motivated it, we see the full context for our own behavior but only the final outcome for other people. We need to take the time to learn the context of others’ lives before we pass judgment on their actions.

Conclusion

We can use the observer effect to our benefit. If we want to change a behavior, finding some way to ensure someone else observes it can be effective. For instance, going to the gym with a friend means they know if we don’t go, making it more likely that we stick with it. Tweeting about our progress on a project can help keep us accountable. Even installing software on our laptop that tracks how often we check social media can reduce our usage.

But if we want to get an accurate view of reality, it is important we consider how observing it may distort the results. The value of knowing about the observer effect in everyday life is that it can help us factor in the difference that observation makes. If we want to gain an accurate picture of the world, it pays to consider how we take that picture. For instance, you cannot assume that an employee’s behavior in a meeting translates to their work, or that the way your kids act at home is the same as in the playground. We all act differently when we know we are being watched.

https://fs.blog/2020/08/observer-effect/

Post by kmaherali »

International Literacy Day 2020

Video:

https://www.youtube.com/watch?v=Ko26TKjmOhc

Educators and education professionals from across the US share why they teach and how they got into the education field.

Post by kmaherali »

A Primer on Algorithms and Bias

The growing influence of algorithms on our lives means we owe it to ourselves to better understand what they are and how they work. Understanding how the data we use to inform algorithms influences the results they give can help us avoid biases and make better decisions.

***

Algorithms are everywhere: driving our cars, designing our social media feeds, dictating which mixer we end up buying on Amazon, diagnosing diseases, and much more.

Two recent books explore algorithms and the data behind them. In Hello World: Being Human in the Age of Algorithms, mathematician Hannah Fry shows us the potential and the limitations of algorithms. And Invisible Women: Data Bias in a World Designed for Men by writer, broadcaster, and feminist activist Caroline Criado Perez demonstrates how we need to be much more conscientious of the quality of the data we feed into them.

Humans or algorithms?

First, what is an algorithm? Explanations of algorithms can be complex. Fry explains that at their core, they are defined as step-by-step procedures for solving a problem or achieving a particular end. We tend to use the term to refer to mathematical operations that crunch data to make decisions.
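
To make the "step-by-step procedure" definition concrete, here is a small, self-contained example. Binary search is our own choice of illustration and does not come from Fry's book:

```python
# Binary search: an unambiguous, repeatable procedure that turns an input
# (a sorted list and a target) into a result (the target's position).
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2            # look at the middle element
        if sorted_items[mid] == target:
            return mid                     # found it: stop
        if sorted_items[mid] < target:
            low = mid + 1                  # discard the lower half
        else:
            high = mid - 1                 # discard the upper half
    return -1                              # exhausted the list: report failure

print(binary_search([2, 5, 8, 13, 21], 13))  # prints 3
```

Every step is fully specified and the procedure always terminates, which is what makes it an algorithm rather than a judgment call.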

When it comes to decision-making, we don’t necessarily have to choose between doing it ourselves and relying wholly on algorithms. The best outcome may be a thoughtful combination of the two.

We all know that in certain contexts, humans are not the best decision-makers. For example, when we are tired, or when we already have a desired outcome in mind, we may ignore relevant information. In Thinking, Fast and Slow, Daniel Kahneman gave multiple examples from his research with Amos Tversky that demonstrated we are heavily influenced by cognitive biases such as availability and anchoring when making certain types of decisions. It’s natural, then, that we would want to employ algorithms that aren’t vulnerable to the same tendencies. In fact, their main appeal for use in decision-making is that they can override our irrationalities.

Algorithms, however, aren’t without their flaws. One of the obvious ones is that because algorithms are written by humans, we often code our biases right into them. Criado Perez offers many examples of algorithmic bias.

For example, an online platform designed to help companies find computer programmers looks through activity such as sharing and developing code in online communities, as well as visiting Japanese manga (comics) sites. People visiting certain sites with frequency received higher scores, thus making them more visible to recruiters.

However, Criado Perez presents the analysis of this recruiting algorithm by Cathy O’Neil, scientist and author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who points out that “women, who do 75% of the world’s unpaid care work, may not have the spare leisure time to spend hours chatting about manga online . . . and if, like most of techdom, that manga site is dominated by males and has a sexist tone, a good number of women in the industry will probably avoid it.”

Criado Perez postulates that the authors of the recruiting algorithm didn’t intend to encode a bias that discriminates against women. But, she says, “if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices.”

Fry also covers algorithmic bias and asserts that “wherever you look, in whatever sphere you examine, if you delve deep enough into any system at all, you’ll find some kind of bias.” We aren’t perfect—and we shouldn’t expect our algorithms to be perfect, either.

In order to have a conversation about the value of an algorithm versus a human in any decision-making context, we need to understand, as Fry explains, that “algorithms require a clear, unambiguous idea of exactly what we want them to achieve and a solid understanding of the human failings they are replacing.”

Garbage in, garbage out

No algorithm is going to be successful if the data it uses is junk. And there’s a lot of junk data in the world. Far from being a new problem, Criado Perez argues that “most of recorded human history is one big data gap.” And that has a serious negative impact on the value we are getting from our algorithms.

Criado Perez explains the situation this way: We live in “a world [that is] increasingly reliant on and in thrall to data. Big data. Which in turn is panned for Big Truths by Big Algorithms, using Big Computers. But when your data is corrupted by big silences, the truths you get are half-truths, at best.”

A common human bias is one regarding the universality of our own experience. We tend to assume that what is true for us is generally true across the population. We have a hard enough time considering how things may be different for our neighbors, let alone for other genders or races. It becomes a serious problem when we gather data about one subset of the population and mistakenly assume that it represents all of the population.

For example, Criado Perez examines the data gap in relation to incorrect information being used to inform decisions about safety and women’s bodies. From personal protective equipment like bulletproof vests that don’t fit properly and thus increase the chances of the women wearing them getting killed, to levels of exposure to toxins that are unsafe for women’s bodies, she makes the case that without representative data, we can’t get good outputs from our algorithms. She writes that “we continue to rely on data from studies done on men as if they apply to women. Specifically, Caucasian men aged twenty-five to thirty, who weigh 70 kg. This is ‘Reference Man’ and his superpower is being able to represent humanity as a whole. Of course, he does not.” Her book covers a wide variety of disciplines and situations in which the gender gap in data leads to worse outcomes for women.

The limits of what we can do

Although there is a lot we can do better when it comes to designing algorithms and collecting the data sets that feed them, it’s also important to consider their limits.

We need to accept that algorithms can’t solve all problems, and there are limits to their functionality. In Hello World, Fry devotes a chapter to the use of algorithms in justice. Specifically, algorithms designed to provide information to judges about the likelihood of a defendant committing further crimes. Our first impulse is to say, “Let’s not rely on bias here. Let’s not have someone’s skin color or gender be a key factor for the algorithm.” After all, we can employ that kind of bias just fine ourselves. But simply writing bias out of an algorithm is not as easy as wishing it so. Fry explains that “unless the fraction of people who commit crimes is the same in every group of defendants, it is mathematically impossible to create a test which is equally accurate at predicting across the board and makes false positive and false negative mistakes at the same rate for every group of defendants.”
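
A toy calculation (our own numbers, not an example from Hello World) shows the tension Fry describes: give two groups with different base rates the very same false positive and false negative rates, and a "high risk" flag ends up being correct far more often for one group than the other:

```python
# Hypothetical groups and rates, chosen only to illustrate the arithmetic.
def positive_predictive_value(population, base_rate, sensitivity, specificity):
    """Share of people flagged 'high risk' who actually reoffend."""
    reoffenders = population * base_rate
    non_reoffenders = population - reoffenders
    true_positives = reoffenders * sensitivity
    false_positives = non_reoffenders * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Identical error rates for both groups: 80% sensitivity, 80% specificity.
for name, base_rate in [("Group A (50% reoffend)", 0.50),
                        ("Group B (20% reoffend)", 0.20)]:
    value = positive_predictive_value(1000, base_rate, 0.8, 0.8)
    print(f"{name}: a 'high risk' flag is correct {value:.0%} of the time")
# Group A: 80%. Group B: 50%. Same error rates, very different accuracy of
# the label itself -- the criteria cannot all be equalized at once.
```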

Fry comes back to such limits frequently throughout her book, exploring them in various disciplines. She demonstrates to the reader that “there are boundaries to the reach of algorithms. Limits to what can be quantified.” Perhaps a better understanding of those limits is needed to inform our discussions of where we want to use algorithms.

There are, however, other limits that we can do something about. Both authors make the case for more education about algorithms and their input data. Lack of understanding shouldn’t hold us back. Algorithms that have a significant impact on our lives specifically need to be open to scrutiny and analysis. If an algorithm is going to put you in jail or impact your ability to get a mortgage, then you ought to be able to have access to it.

Most algorithm writers and the companies they work for wave the “proprietary” flag and refuse to open themselves up to public scrutiny. Many algorithms are a black box—we don’t actually know how they reach the conclusions they do. But Fry says that shouldn’t deter us. Pursuing laws (such as the data access and protection rights being instituted in the European Union) and structures (such as an algorithm-evaluating body playing a role similar to the one the U.S. Food and Drug Administration plays in evaluating whether pharmaceuticals can be made available to the U.S. market) will help us decide as a society what we want and need our algorithms to do.

Where do we go from here?

Algorithms aren’t going away, so it’s best to acquire the knowledge needed to figure out how they can help us create the world we want.

Fry suggests that one way to approach algorithms is to “imagine that we designed them to support humans in their decisions, rather than instruct them.” She envisions a world where “the algorithm and the human work together in partnership, exploiting each other’s strengths and embracing each other’s flaws.”

Part of getting to a world where algorithms provide great benefit is to remember how diverse our world really is and make sure we get data that reflects the realities of that diversity. We can either actively change the algorithm, or we change the data set. And if we do the latter, we need to make sure we aren’t feeding our algorithms data that, for example, excludes half the population. As Criado Perez writes, “when we exclude half of humanity from the production of knowledge, we lose out on potentially transformative insights.”

Given how complex the world of algorithms is, we need all the amazing insights we can get. Algorithms themselves perhaps offer the best hope, because they have the inherent flexibility to improve as we do.

Fry gives this explanation: “There’s nothing inherent in [these] algorithms that means they have to repeat the biases of the past. It all comes down to the data you give them. We can choose to be ‘crass empiricists’ (as Richard Berk put it) and follow the numbers that are already there, or we can decide that the status quo is unfair and tweak the numbers accordingly.”

We can get excited about the possibilities that algorithms offer us and use them to create a world that is better for everyone.

https://fs.blog/2020/09/algorithms-and-bias/

Post by kmaherali »

The Spiral of Silence

Our desire to fit in with others means we don’t always say what we think. We only express opinions that seem safe. Here’s how the spiral of silence works and how we can discover what people really think.

***

Be honest: How often do you feel as if you’re really able to express your true opinions without fearing judgment? How often do you bite your tongue because you know you hold an unpopular view? How often do you avoid voicing any opinion at all for fear of having misjudged the situation?

Even in societies with robust free speech protections, most people don’t often say what they think. Instead they take pains to weigh up the situation and adjust their views accordingly. This comes down to the “spiral of silence,” a human communication theory developed by German researcher Elisabeth Noelle-Neumann in the 1960s and ’70s. The theory explains how societies form collective opinions and how we make decisions surrounding loaded topics.

Let’s take a look at how the spiral of silence works and how understanding it can give us a more realistic picture of the world.

***

How the spiral of silence works

According to Noelle-Neumann’s theory, our willingness to express an opinion is a direct result of how popular or unpopular we perceive it to be. If we think an opinion is unpopular, we will avoid expressing it. If we think it is popular, we will make a point of showing we think the same as others.

Controversy is also a factor—we may be willing to express an unpopular uncontroversial opinion but not an unpopular controversial one. We perform a complex dance whenever we share views on anything morally loaded.

Our perception of how “safe” it is to voice a particular view comes from the clues we pick up, consciously or not, about what everyone else believes. We make an internal calculation based on signs like what the mainstream media reports, what we overhear coworkers discussing on coffee breaks, what our high school friends post on Facebook, or prior responses to things we’ve said.

We also weigh up the particular context, based on factors like how anonymous we feel or whether our statements might be recorded.

As social animals, we have good reason to be aware of whether voicing an opinion might be a bad idea. Cohesive groups tend to have similar views. Anyone who expresses an unpopular opinion risks social exclusion or even ostracism within a particular context or in general. This may be because there are concrete consequences, such as losing a job or even legal penalties. Or there may be less official social consequences, like people being less friendly or willing to associate with you. Those with unpopular views may suppress them to avoid social isolation.

Avoiding social isolation is an important instinct. From an evolutionary biology perspective, remaining part of a group is important for survival, hence the need to at least appear to share the same views as everyone else. The only time someone will feel safe to voice a divergent opinion is if they think the group will share it or be accepting of divergence, or if they view the consequences of rejection as low. But biology doesn’t just dictate how individuals behave—it ends up shaping communities. It’s almost impossible for us to step outside of that need for acceptance.

A feedback loop pushes minority opinions towards less and less visibility—hence why Noelle-Neumann used the word “spiral.” Each time someone voices a majority opinion, they reinforce the sense that it is safe to do so. Each time someone receives a negative response for voicing a minority opinion, it signals to anyone sharing their view to avoid expressing it.
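
To see how such a loop can play out, here is a toy simulation. It is our own construction, not Noelle-Neumann's formal model, and the numbers (a 60/40 split, five rounds) are arbitrary: agents voice their view with a probability tied to how much support they think it has, and perceptions are then updated only from what was actually said.

```python
import random

random.seed(1)
opinions = ["A"] * 60 + ["B"] * 40      # the true split never changes
perceived_b = 0.40                      # initial sense of how popular B is

for round_number in range(1, 6):
    # an agent speaks up more readily the more support it thinks its view has
    voiced = [o for o in opinions
              if random.random() < (perceived_b if o == "B" else 1 - perceived_b)]
    # perception now tracks only what was heard, closing the feedback loop
    perceived_b = voiced.count("B") / max(len(voiced), 1)
    print(f"round {round_number}: B makes up {perceived_b:.0%} of voiced opinions")
```

The underlying 40% minority never shrinks; only its visibility tends to shrink round after round, which is the kind of spiral the theory describes.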

***

An example of the spiral of silence

A 2014 Pew Research survey of 1,801 American adults examined the prevalence of the spiral of silence on social media. Researchers asked people about their opinions on one public issue: Edward Snowden’s 2013 revelations of US government surveillance of citizens’ phones and emails. They selected this issue because, while controversial, prior surveys suggested a roughly even split in public opinion surrounding whether the leaks were justified and whether such surveillance was reasonable.

Asking respondents about their willingness to share their opinions in different contexts highlighted how the spiral of silence plays out. 86% of respondents were willing to discuss the issue in person, but only about half as many were willing to post about it on social media. Of the 14% who would not consider discussing the Snowden leaks in person, almost none (0.3%) were willing to turn to social media instead.

Both in person and online, respondents reported far greater willingness to share their views with people they knew agreed with them—three times as likely in the workplace and twice as likely in a Facebook discussion.

***

The implications of the spiral of silence

The end result of the spiral of silence is a point where no one publicly voices a minority opinion, regardless of how many people believe it. The first implication of this is that the picture we have of what most people believe is not always accurate. Many people nurse opinions they would never articulate to their friends, coworkers, families, or social media followings.

A second implication is that the possibility of discord makes us less likely to voice an opinion at all, assuming we are not trying to drum up conflict. In the aforementioned Pew survey, people were more comfortable discussing a controversial story in person than online. An opinion voiced online has a much larger potential audience than one voiced face to face, and it’s harder to know exactly who will see it. Both of these factors increase the risk of someone disagreeing.

If we want to gauge what people think about something, we need to remove the possibility of negative consequences. For example, imagine a manager who often sets overly tight deadlines, causing immense stress to their team. Everyone knows this is a problem and discusses it among themselves, recognizing that more realistic deadlines would be motivating, and unrealistic ones are just demoralizing. However, no one wants to say anything because they’ve heard the manager say that people who can’t handle pressure don’t belong in that job. If the manager asks for feedback about their leadership style, they’re not going to hear what they need to hear if they know who it comes from.

A third implication is that what seems like a sudden change in mainstream opinions can in fact be the result of a shift in what is acceptable to voice, not in what people actually think. A prominent public figure getting away with saying something controversial may make others feel safe to do the same. A change in legislation may make people comfortable saying what they already thought.

For instance, if recreational marijuana use is legalized where someone lives, they might freely remark to a coworker that they consume it and consider it harmless. Even if that was true before the legislation change, saying so would have been too fraught, so they might have lied or avoided the topic. The result is that mainstream opinions can appear to change a great deal in a short time.

A fourth implication is that highly vocal holders of a minority opinion can end up having a disproportionate influence on public discourse. This is especially true if that minority is within a group that already has a lot of power.

While this was less the case during Noelle-Neumann’s time, the internet makes it possible for a vocal minority to make their opinions seem far more prevalent than they actually are—and therefore more acceptable. Indeed, the most extreme views on any spectrum can end up seeming most normal online because people with a moderate take have less of an incentive to make themselves heard.

In anonymous environments, the spiral of silence can end up reversing itself, making the most fringe views the loudest.

https://fs.blog/2020/09/spiral-of-silence/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Teachers: Shaping the future of our Jamat

Video:

https://www.youtube.com/watch?v=vEzxb9ONG9I
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Video Quote: MHI on Knowledge Society
Image
Video:

https://www.youtube.com/watch?v=j7Nl7rdI7W4
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Time To Think: A Glimpse into the Past: A Window to the Future with Dr Shainool Jiwa

Our history offers a rich repository of our beliefs and values, and how we have lived by them through the centuries. This talk explores select examples from the Dawr al-Satr (765-909 CE) and the Fatimid period (909-1171 CE) of our history, to illustrate how the Imams and the leadership at the time dealt with challenging circumstances of their age, using them as a springboard for laying stronger foundations for the future of the Jamat across various regions of the world.

Video:

https://tv.ismaili/watch/time-to-think- ... inool-jiwa
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

How Julia Child Used First Principles Thinking

There’s a big difference between knowing how to follow a recipe and knowing how to cook. If you can master the first principles within a domain, you can see much further than those who are just following recipes. That’s what Julia Child, “The French Chef”, did throughout her career.

Following a recipe might get you the results you want, but it doesn’t teach you anything about how cooking works at the foundational level. Or what to do when something goes wrong. Or how to come up with your own recipes when you open the fridge on a Wednesday night and realize you forgot to go grocery shopping. Or how to adapt recipes for your own dietary needs.

Adhering to recipes will only get you so far, and it certainly won’t result in you coming up with anything new or creative.

People who know how to cook understand the basic principles that make food taste, look, and smell good. They have confidence in troubleshooting and solving problems as they go—or adjusting to unexpected outcomes. They can glance at an almost barren kitchen and devise something delicious. They know how to adapt to a guest with a gluten allergy or a child who doesn’t like green food. Sure, they might consult a recipe when it makes sense to do so. But they’re not dependent on it, and they can change it up based on their particular circumstances.

There’s a reason many cooking competition shows feature a segment where contestants need to design their own recipe from a limited assortment of ingredients. Effective improvisation shows the judges that someone can actually cook, not just follow recipes.

We can draw a strong parallel from cooking to thinking. If you want to learn how to think for yourself, you can’t just follow what someone else came up with. You need to understand first principles if you want to be able to solve complex problems or think in a unique, creative fashion. First principles are the building blocks of knowledge, the foundational understanding acquired from breaking something down into its most essential concepts.

One person who exemplifies first principles thinking is Julia Child, an American educator who charmed audiences with her classes, books, and TV shows. First principles thinking enabled Julia to both master her own struggles with cooking and then teach the world to do the same. In Something from the Oven, Laura Shapiro tells the charming story of how she did it. Here’s what we can learn about better thinking from the “French Chef.”

More...

https://fs.blog/2020/11/how-julia-child ... -thinking/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Quote from MHI's speech on modern education:

But education comes in many forms, and has been used for many purposes. An education for success in the modern world must be enabling and it must be outward looking. It must not only teach the time tested skills of reading, writing, and mathematics, which remain important, and must not only build on Central Asia’s fine tradition of encouraging students to master more than one language. Today’s students need to learn to use computers. The ability to use communication and information technology is now a critical part of the learning, as well as an essential qualification for eventual application in the workplace.

But even this is not enough. There are two more dimensions of education for the modern world about which I would like to make a few remarks. The first relates to inquisitiveness, critical thinking, and problem solving. What students know is no longer the most important measure of the quality of education. The true test is the ability to engage with what they do not know, and to work out a solution. The second dimension involves the ability to reach conclusions that constitutes the basis for informed judgements. The ability to make judgements that are grounded in solid information, and employ careful analysis should be one of the most important goals for any educational endeavour. As students develop this capacity, they can begin to grapple with the most important and difficult step: to learn to place such judgements in an ethical framework. Therein lies the formation of the kind of social consciousness that our world so desperately needs.

I hasten to add that these capacities cannot be developed quickly, nor can they be mastered at the high school level. But a beginning must be made, and starting this process should be part of the mission of this institution.

https://www.akdn.org/speech/his-highnes ... han-school
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Why Computers Will Never Write Good Novels

The power of narrative flows only from the human brain.


https://nautil.us/issue/95/escape/why-c ... b00bf1f6eb

You’ve been hoaxed.

The hoax seems harmless enough. A few thousand AI researchers have claimed that computers can read and write literature. They’ve alleged that algorithms can unearth the secret formulas of fiction and film. That Bayesian software can map the plots of memoirs and comic books. That digital brains can pen primitive lyrics and short stories—wooden and weird, to be sure, yet evidence that computers are capable of more.

But the hoax is not harmless. If it were possible to build a digital novelist or poetry analyst, then computers would be far more powerful than they are now. They would in fact be the most powerful beings in the history of Earth. Their power would be the power of literature, which although it seems now, in today’s glittering silicon age, to be a rather unimpressive old thing, springs from the same neural root that enables human brains to create, to imagine, to dream up tomorrows. It was the literary fictions of H.G. Wells that sparked Robert Goddard to devise the liquid-fueled rocket, launching the space epoch; and it was poets and playwrights—Homer in The Iliad, Karel Čapek in Rossumovi Univerzální Roboti—who first hatched the notion of a self-propelled metal robot, ushering in the wonder-horror of our modern world of automata.

At the bottom of literature’s strange and branching multiplicity is an engine of causal reasoning.

If computers could do literature, they could invent like Wells and Homer, taking over from sci-fi authors to engineer the next utopia-dystopia. And right now, you probably suspect that computers are on the verge of doing just that: Not too far in the future, maybe in my lifetime even, we’ll have a computer that creates, that imagines, that dreams. You think that because you’ve been duped by the hoax. The hoax, after all, is everywhere: college classrooms, public libraries, quiz games, IBM, Stanford, Oxford, Hollywood. It’s become such a pop-culture truism that Wired enlisted an algorithm, SciFiQ, to craft “the perfect piece of science fiction.”

Yet despite all this gaudy credentialing, the hoax is a complete cheat, a total scam, a fiction of the grossest kind. Computers can’t grasp the most lucid haiku. Nor can they pen the clumsiest fairytale. Computers cannot read or write literature at all. And they never, never will.

I can prove it to you.

Proof and more...

https://nautil.us/issue/95/escape/why-c ... b00bf1f6eb
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Video Quote: On Contributions of African and Asian Intellectuals Towards Global Knowledge
Image
Video:

https://www.youtube.com/watch?v=ufrgO77pjpQ
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The secret to excellent memory

Hi Karim,

Our education system has only taught us WHAT to learn — Math, Science, Geography, Spanish, etc...

But it's never taught us HOW to learn.

HOW to absorb more information in less time.

HOW to successfully retain the knowledge we learn.

HOW to effectively recall information when we need it most.

Which is why the brilliant new Mindvalley Masterclass with Jim Kwik on How To Be A Super Learner is so important.

Because learning HOW to learn may just be the most important skill you will ever master.

>> Register here for free
https://www.mindvalley.com/superbrain/m ... d=13577274

You’ll find out:

- 10 powerful, yet easily applicable hacks that will quickly unlock the super learner within.
- The big lie we were all told about our brains. Contrary to conventional wisdom, your memory is not fixed. And when you realize just how much control you have, it will shift what you think you're truly capable of.
- A 2,500-year-old ancient Greek memorization technique that will instantly improve your memory and can be used in virtually any situation – from delivering a speech without notes to remembering your entire grocery list without writing it down.
- How the 2 most costly words in your life are robbing you of your peace of mind, your performance, your productivity, and even your prosperity.

And much more...
>> Sign up for How To Be A Super Learner right here.

https://www.mindvalley.com/superbrain/m ... d=13577274

Yours for lifelong learning,

Ocean Robbins
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Our Most Effective Weapon Is Imagination

Why science changes everything.


Excerpt:

Culture, the awareness of one’s own deepest roots, is a kind of superpower that guarantees a good chance of survival even in the most extreme conditions. Imagine for a moment two primitive social groups, two small clans of Neanderthals that live isolated from each other in the frozen Europe of that era. Now suppose that by chance one of these groups develops its own distinct vision of the world, cultivated and perpetuated over generations through rituals and ceremonies, and perhaps visually represented in cave paintings, while the other group fails to do so, evolving, that is, without developing any sophisticated form of culture. Now let’s suppose that a disaster strikes both groups: a flood or a period of cold even more extreme than usual, or an attack by ferocious beasts that leaves only a solitary living survivor. This last man standing, in the case of both groups, will have to overcome a thousand dangers, face every kind of privation, perhaps migrate to other zones and even evade the hostility of other humans. Which of the two will show the most resilience? Who will have the best chance of surviving?

A creation story, a narrative of origins, gives you the strength to get up when you are knocked down, the motivation to endure the most desperate circumstances. Clinging to the blanket that gives us protection and an identity, we find the strength to resist and to carry on. To be able to place ourselves and the others in our clan in a long chain of events that began in a distant past gives us the opportunity to imagine a future. Whoever has this knowledge can place in a wider framework the terrible vicissitudes of the present, giving sense to suffering, helping us to overcome even the most terrible tragedies.

And that is why we are still here, thousands of generations later, to give value to art, philosophy, science. Because we are the inheritors of this natural selection. Those individuals and groups most equipped to develop a symbolic universe have enjoyed a significant evolutionary advantage. And we are their descendants.

More...

https://nautil.us/issue/99/universality ... b00bf1f6eb
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Stories We Love Make Us Who We Are

Before there were books, there were stories. At first the stories weren’t written down. Sometimes they were even sung. Children were born, and before they could speak, their parents sang them songs, a song about an egg that fell off a wall, perhaps, or about a boy and a girl who went up a hill and fell down it. As the children grew older, they asked for stories almost as often as they asked for food.

The children fell in love with these stories and wanted to hear them over and over again. Then they grew older and found those stories in books. And other stories that they had never heard before, about a girl who fell down a rabbit hole, or a silly old bear and an easily scared piglet and a gloomy donkey, or a phantom tollbooth, or a place where wild things were. The act of falling in love with stories awakened something in the children that would nourish them all their lives: their imagination.

The children made up play stories every day, they stormed castles and conquered nations and sailed the oceans blue, and at night their dreams were full of dragons. But they went on growing up and slowly the stories fell away from them, the stories were packed away in boxes in the attic, and it became harder for the former children to tell and receive stories, harder for them, sadly, to fall in love.

I believe that the books and stories we fall in love with make us who we are, or, not to claim too much, the beloved tale becomes a part of the way in which we understand things and make judgments and choices in our daily lives. A book may cease to speak to us as we grow older, and our feeling for it will fade. Or we may suddenly, as our lives shape and hopefully increase our understanding, be able to appreciate a book we dismissed earlier; we may suddenly be able to hear its music, to be enraptured by its song.

When, as a college student, I first read Günter Grass’s great novel “The Tin Drum,” I was unable to finish it. It languished on a shelf for fully 10 years before I gave it a second chance, whereupon it became one of my favorite novels of all time: one of the books I would say that I love. It is an interesting question to ask oneself: Which are the books that you truly love? Try it. The answer will tell you a lot about who you presently are.

More...

https://www.nytimes.com/2021/05/24/opin ... 778d3e6de3
KayBur
Posts: 11
Joined: Thu May 27, 2021 12:41 am

Post by KayBur »

I can say that I am now reading things I could not read 5-7 years ago; I am still growing. Although I stopped growing physically a long time ago, mentally many people continue to develop all their lives. Why do I say “many”? Because there is a category of people who are so confident in their own omniscience that they do not consider it necessary to learn anything new or to grow beyond themselves.
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

MHI Farman: Continuous Education
Dar-es-salaam, Tanzania
July 17, 2002

"...also for good health, unity in your families and success in your education, continuous education, even when you're 90, or a 100, or a 110. Keep educating yourself because the domain of human knowledge is expanding every second, and you must not lose the opportunity to benefit from that knowledge and make it serve you in your lives. Khanavadan."
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Kharadhar Jamatkhana
Karachi, Pakistan
24 October 2000

The reality of our world is that education, knowledge, is becoming the single most important driving force in human society. Never before in human society has so much good knowledge, mediocre knowledge and bad knowledge been available to people. You, therefore, have to make judgements, careful judgements, as to what knowledge is good, what knowledge is mediocre, and what knowledge is simply not knowledge: it is simply information which is not good for society. Choose wisely, and choose that knowledge in such a way that it can serve the balanced purpose of your lives. Therefore, the knowledge must be drawn not only from the secular but also from the spiritual. In seeking knowledge, remember that this knowledge will continue to be increasingly important in your lives.
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

How to Remember What You Read

It happens all the time. You read an amazing book, one so packed with wisdom that you think it’s going to change your life forever. Then…it doesn’t. Why? Because when you’re finally in a situation where you could use its insights, you’ve completely forgotten them. Time is our most valuable resource, so we shouldn’t waste it. The investment we make in reading should have a positive, lasting impact on our lives.

Many people assume that consuming information is the same as acquiring knowledge. No idea could be further from the truth.

Learning means being able to use new information. The basic process of learning consists of reflection and feedback. We learn facts and concepts through reflecting on experience—our own or others’. If you read something and you don’t make time to think about what you’ve read, you won’t be able to use any of the wisdom you’ve been exposed to.

One of the reasons that we read books is because they offer a rich tapestry of details, allowing us to see the world of the author and go on their journey with them. Our brains can learn not only the author’s ideas but also when their conclusions about how to live are likely to work and when they are likely to fail (thanks to the vast amount of details that authors share about their experiences and thought processes).

But if you only remember six things after reading this article, they should be the following truths about reading:

Quality matters more than quantity. If you read one book a month but fully appreciate and absorb it, you’ll be better off than someone who skims half the library without paying attention.

Speed-reading is bullshit. Getting the rough gist and absorbing the lessons are two different things. Confuse them at your peril.

Book summary services miss the point. A lot of companies charge ridiculous prices for access to vague summaries bearing only the faintest resemblance to anything in the book. Summaries can be a useful jumping-off point to explore your curiosity, but you cannot learn from them the way you can from the original text.*

Fancy apps and tools are not needed. A notebook, index cards, and a pen will do just fine.

We shouldn’t read stuff we find boring. Life is far too short.

Finishing the book is optional. You should start a lot of books and only finish a few of them.

* (By the way, the book summaries we write for Farnam Street members are definitely not in the same class as the standard fare. We make a significant time investment in each one, including reading the book several times and doing background research on the author, context, and content. And we still don’t pretend they’re as valuable as reading the book!)

In this article, we’ll explore multiple strategies for getting more out of what you read. You don’t need to use all these strategies for every book. Using just a couple of them, whether you’re trying to learn a new philosophy or reading a work of fiction, can help you retain more and make deeper connections.

What you read can give you access to untold knowledge. But how you read changes the trajectory of your life.

More...

https://fs.blog/2021/08/remember-books/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Why Poetry Is So Crucial Right Now

This summer, on a lark, I took a course on poetry geared toward Christian leaders. Twelve of us met over Zoom to read poems and discuss the intersection of our faith, vocations and poetry.

We compared George Herbert’s “Prayer” to Christian Wiman’s “Prayer.” We discussed Langston Hughes’s “Island,” Countee Cullen’s “Yet Do I Marvel” and Scott Cairns’s “Musée” to examine suffering and the problem of evil. We read about Philip Larkin’s fear of death and what he sees as the failures of religious belief in his poem “Aubade.” It was my favorite part of the summer.

In our first class, we took turns sharing what drew us to spend time with poetry. I clumsily tried to explain my longing for verse: I hunger for a transcendent reality — the good, the true, the beautiful, those things which somehow lie beyond mere argument. Yet often, as a writer, a pastor and simply a person online, I find that my life is dominated by debate, controversy and near strangers in shouting matches about politics or church doctrine. This past year in particular was marked by vitriol and divisiveness. I am exhausted by the rancor.

In this weary and vulnerable place, poetry whispers of truths that cannot be confined to mere rationality or experience. In a seemingly wrecked world, I’m drawn to Rainer Maria Rilke’s “Autumn” and recall that “there is One who holds this falling/Infinitely softly in His hands.” When the scriptures feel stale, James Weldon Johnson preaches through “The Prodigal Son” and I hear the old parable anew. On tired Sundays, I collapse into Wendell Berry’s Sabbath poems and find rest.

I’m not alone in my interest in this ancient art form. Poetry seems to be making a comeback. According to a 2018 survey by the National Endowment for the Arts, the number of adults who read poetry nearly doubled in five years, marking the highest number on record for the last 15 years. The poet Amanda Gorman stole the show at this year’s presidential inauguration, and her collection “The Hill We Climb” topped Amazon’s best-seller list.

There is not a simple or singular reason for this resurgence. But I think a particular gift of poetry for our moment is that good poems reclaim the power and grace of words.

Words seem ubiquitous now. We carry a world of words with us every moment in our smartphones. We interact with our family and friends through the written word in emails, texts and Facebook posts. But with our newfound ability to broadcast any words we want, at any moment, we can cheapen them.

“Like any other life-sustaining resource,” Marilyn Chandler McEntyre writes in her book “Caring for Words in a Culture of Lies,” “language can be depleted, polluted, contaminated, eroded and filled with artificial stimulants.” She argues that language needs to be rescued and restored, and points us to the practice of reading and writing poetry as one way of doing so. Poems, she says, “train and exercise the imagination” to “wage peace” because “the love of beauty is deeply related to the love of peace.”

Indeed, in our age of social media, words are often used as weapons. Poetry instead treats words with care. They are slowly fashioned into lanterns — things that can illuminate and guide. Debate certainly matters. Arguments matter. But when the urgent controversies of the day seem like all there is to say about life and death or love or God, poetry reminds me of those mysterious truths that can’t be reduced solely to linear thought.

Poetry itself can engage in smart debate, of course. Yet even didactic poetry — poetry that makes an argument — does so in a more creative, meticulous and compelling way than we usually see in our heated public discourse.

Another reason that I think we are drawn to poetry: Poems slow us down. My summer poetry class teacher, Abram Van Engen, an English professor at Washington University in St. Louis, reminded me that poetry is the “art of paying attention.” In an age when our attention is commodified, when corporations make money from capturing our gaze and holding it for as long as possible, many of us feel overwhelmed by the notifications, busyness and loudness of our lives. Poetry calls us back to notice and attend to the embodied world around us and to our internal lives.

In this way, poetry is like prayer, a comparison many have made. Both poetry and prayer remind us that there is more to say about reality than can be said in words, though in both we use words to try to glimpse what is beyond words. And they both make space to name our deepest longings, lamentations, and loves. Perhaps this is why the poetry of the Psalms became the first prayer book of the church.

I am trying to take up more poetry reading in my daily life. Reading new poems can be intimidating, but I figure that the only way to get poetry really wrong is to avoid it altogether. It helps that poetry is often short and quick to read so I fit it into the corners of my day — a few minutes in bed at night or in the lull of a Saturday afternoon.

During the past school year, with my kids home because of Covid precautions, we would pile books of poetry on our table once a week (Shel Silverstein, Shakespeare, Nikki Grimes, Emily Dickinson), eat cookies, and read poetry aloud. I now try to always keep some books of verse around.

In one of my very favorite poems, “Pied Beauty,” Gerard Manley Hopkins writes of a beauty that is “past change.” In this world where our political, technological and societal landscape shifts at breakneck speed, many of us still quietly yearn for a beauty beyond change. Poetry stands then as a kind of collective cry beckoning us beyond that which even our best words can say.

https://www.nytimes.com/2021/08/29/opin ... 778d3e6de3
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Why You Should Stop Reading News

We spend hours consuming news because we want to be informed. The problem is news doesn’t make us informed. In fact, the more news we consume the more misinformed we become.

News is, by definition, something that doesn’t last. It exists for only a moment before it changes. As news has become easier to distribute and cheaper to produce, the quality has decreased and the quantity has increased, making it nearly impossible to find the signal in the noise.

Rarely do we stop to ask ourselves questions about the media we consume: Is this good for me? Is this dense with detailed information? Is this important? Is this going to stand the test of time? Is the person writing someone who is well informed on the issue? Asking those questions makes it clear the news isn’t good for you.

“[W]e’re surrounded by so much information that is of immediate interest to us that we feel overwhelmed by the never-ending pressure of trying to keep up with it all.”
— Nicholas Carr

Here are only a few of the problems with the news:

First, the speed of news delivery has increased. We used to have to wait to get a newspaper or gossip with people in our town to get our news, but not anymore. Thanks to alerts, texts, and other interruptions, news finds us almost the minute it’s published.

Second, the cost to produce news has dropped significantly. Some people write 12 blog posts a day for major newspapers. It’s nearly impossible to write something thoughtful on one topic, let alone 12. Over a year, this works out to roughly 2,880 articles (12 a day, five days a week, for 48 weeks, allowing four weeks of vacation). The fluency of the person you’re getting your news from in the subject they’re covering is near zero. As a result, you’re filling your head with surface opinions on isolated topics. Because the costs to produce the news have dropped to almost nothing, there is a lot of competition. (Consider the contrast with FS. We write 40 articles a year with 3 writers. It takes a lot of effort to produce timeless content.)

Third, like other purveyors of drugs, producers of news want you to consume more of it. News producers perpetuate a culture of “tune in, don’t miss out, someone knows something you don’t, follow this or you’ll be misinformed, oh wait, look at this!” The time used to consume news comes out of time that could be used for timeless content.

Fourth, the incentives are misaligned. When the news is free, you still need to pay people. If people aren’t paying, advertisers are. And if advertisers are in charge, the incentives change. Page views become the name of the game. More page views mean more revenue. When it comes to page views, the more controversy, the more share-ability, the more enraged you become, the better. For a lot of people who create news (I won’t use the term “journalists” here because I hold them in high regard), the more page views they get, the more they are compensated. A lot of these ads aren’t just static impressions; they’re also transmitting information about you to the advertisers.

“What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”
— Herbert Simon

Most of what you read online today is pointless. It’s not important to living a good life. It’s not going to help you make better decisions. It’s not going to help you understand the world. It’s not dense with information. It’s not going to help you develop deep and meaningful connections with the people around you.

Like a drug, the news is addictive. Not only does it alter your mood, but it keeps you wanting more. Once you start consuming news, it’s hard to stop. The hotels, transportation, and ticketing systems in Disney World are all designed to keep you within the theme park rather than sightseeing elsewhere in Orlando. Similarly, once you’re on Facebook, it does everything possible, short of taking over your computer to prevent you from leaving. But while platforms like Facebook play a role in our excessive media consumption, we are not innocent. Far from it. We want to be well informed. (More accurately, we want to appear to be well informed.) And this is the very weakness that gets manipulated.

“To be completely cured of newspapers, spend a year reading the previous week’s newspapers.”
— Nassim Taleb

Someone we know reads The New Yorker, The New York Times, The Economist, The Wall Street Journal, her local newspaper, and several other publications. She’s addicted. She wants to know everything that’s going on. Like the rest of us, she just wants to have a well-informed opinion. If reading the news makes you well-informed, then not keeping up makes you ignorant.

When you stop reading the news, the first thing you notice about people who read the news is how misinformed they are. Often, they cherry-pick one piece of information and give it enormous weight in their opinions. Like the dog that didn’t bark, you realize that they have a tiny lens into a big and messy issue.

Another thing you notice is that you weren’t as well informed as you thought. The news didn’t make your opinions more rational, it just made you more confident you were right. Rarely do we read things we disagree with. It’s much easier to just put our walls up and tune out. Filtering our own news like this self-reinforces what we already believe.

At some point in the future, news will likely be tailored for you. Just as your search results are different from mine, your headline for the same article and mine will be different. The word “same” is an important one. It won’t be the same article at all. The author might have written several versions, one tailored to people who lean left and one for people who lean right. Even the URL will be different, with each version having a unique URL so the publisher can track time on page, headlines that drive clicks, and shareability.

Another thing you notice is how people who are in the news a lot worry about what the news says about them. Not only does this increase their anxiety but it changes how they think and act. Instead of getting feedback from reality, they crave validation in the printed opinion of others.

Not reading news shows you how often what you thought was your thinking belonged to someone else. Thinking is hard. It’s much easier to let someone else think for you. Without news in my life, I find that I say “I don’t know” more often.

When all you consume is noise, you don’t realize there is a signal. Your attention is valuable. In fact, your attention is so valuable, it might be the most important thing you have. If you know it’s valuable, why would you spend it on something that is irrelevant tomorrow?

Stepping back from news is hard. We’re afraid of silence, afraid to be alone with our thoughts. That’s why we pull out our phones when we’re waiting in line at a coffee shop or the grocery store. We’re afraid to ask ourselves deep and meaningful questions. We’re afraid to be bored. We’re so afraid, that to avoid it, we’ll literally drive ourselves crazy, consuming pointless information.

Can you do something different? I think so. Part of the answer is to spend less time consuming and more time thinking. The other part is to change your information sources from the news. Seek out dense sources of information. Some indicators you’ve found them are timeless content and direct first-hand experience. This means fewer articles and more books.

If you must read the news, read it for the facts and the data, not the opinions.

Let’s close with this quote by Winifred Gallagher: “Few things are as important to your quality of life as your choices about how to spend the precious resource of your free time.”

https://fs.blog/2013/12/stop-reading-news/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Second-Order Thinking: What Smart People Use to Outperform

Image

The Great Mental Models Volumes One https://www.amazon.com/gp/product/B07P7 ... slink=true and Two https://www.amazon.com/gp/product/B085H ... slink=true are out.
Learn more about the project here https://fs.blog/tgmm/ .

Things are not always as they appear. Often when we solve one problem, we end up unintentionally creating another one that’s even worse. The best way to examine the long-term consequences of our decisions is to use second-order thinking.

It’s often easier to identify when people didn’t adequately consider the second and subsequent order impacts. For example, consider a country that, wanting to inspire regime change in another country, funds and provides weapons to a group of “moderate rebels.” Only it turns out that those moderate rebels will become powerful and then go to war with the sponsoring country for decades. Whoops.

“Failing to consider second- and third-order consequences is the cause of a lot of painfully bad decisions, and it is especially deadly when the first inferior option confirms your own biases. Never seize on the first available option, no matter how good it seems, before you’ve asked questions and explored.”
—Ray Dalio

The ability to think through problems to the second, third, and nth order—or what we will call second-order thinking for short—is a powerful tool that supercharges your thinking.

Image

Second-Order Thinking

In his exceptional book, The Most Important Thing, Howard Marks explains the concept of second-order thinking, which he calls second-level thinking.

First-level thinking is simplistic and superficial, and just about everyone can do it (a bad sign for anything involving an attempt at superiority). All the first-level thinker needs is an opinion about the future, as in “The outlook for the company is favorable, meaning the stock will go up.” Second-level thinking is deep, complex and convoluted.

First-order thinking is fast and easy. It happens when we look for something that only solves the immediate problem without considering the consequences. For example, you can think of this as: “I’m hungry, so let’s eat a chocolate bar.”

Second-order thinking is more deliberate. It is thinking in terms of interactions and time, understanding that despite our intentions our interventions often cause harm. Second-order thinkers ask themselves the question “And then what?” This means thinking about the consequences of repeatedly eating a chocolate bar when you are hungry and using that to inform your decision. If you do this, you’re more likely to eat something healthy.

First-order thinking looks the same from person to person: everyone reaches the same conclusions. This is where things get interesting. The road to out-thinking people can’t come from first-order thinking. It must come from second-order thinking. Extraordinary performance comes from seeing things that other people can’t see.

Image

Improving Your Ability To Think

Here are a few ways to put second-order thinking into practice today.

- Always ask yourself “And then what?”
- Think through time — What do the consequences look like in 10 minutes? 10 months? 10 years?
- Create templates like the second image above with 1st-, 2nd-, and 3rd-order consequences. Identify your decision, then think through and write down the consequences. If you review these regularly, you’ll be able to calibrate your thinking. (A short code sketch of one such template follows this list.)
- (Bonus) If you’re using this to think about business decisions, ask yourself how important parts of the ecosystem are likely to respond. How will employees deal with this? What will my competitors likely do? What about my suppliers? What about the regulators? Often the answer will be little to no impact, but you want to understand the immediate and second-order consequences before you make the decision.
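
To make the template idea concrete, here is a minimal, hypothetical Python sketch (the class, field, and example names are invented for illustration and do not come from the article) of one way to record a decision together with its expected 1st-, 2nd-, and 3rd-order consequences across time horizons, so they can be reviewed later against what actually happened:

Code:

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class ConsequenceTemplate:
    # The decision being examined, e.g. "Eat a chocolate bar whenever I'm hungry".
    decision: str
    # Maps consequence order (1, 2, 3, ...) to (time horizon, expected outcome) pairs.
    consequences: Dict[int, List[Tuple[str, str]]] = field(default_factory=dict)

    def add(self, order: int, horizon: str, outcome: str) -> None:
        # Record an expected consequence at a given order and time horizon.
        self.consequences.setdefault(order, []).append((horizon, outcome))

    def review(self) -> None:
        # Print the template so past predictions can be checked against reality.
        print("Decision:", self.decision)
        for order in sorted(self.consequences):
            for horizon, outcome in self.consequences[order]:
                print("  order", order, "|", horizon, "|", outcome)


# Example usage, following the chocolate-bar example above.
template = ConsequenceTemplate("Eat a chocolate bar whenever I'm hungry")
template.add(1, "10 minutes", "Hunger satisfied, a quick hit of pleasure")
template.add(2, "10 months", "Worse energy levels and eating habits")
template.add(3, "10 years", "Health problems from a consistently poor diet")
template.review()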

A lot of extraordinary things in life are the result of things that are first-order negative and second-order positive. So just because something looks like it has no immediate payoff doesn’t mean that’s the case. All it means is that you’ll have less competition if the second- and third-order consequences are positive, because everyone who thinks only at the first order won’t think things through.

Second-order thinking takes a lot of work. It’s not easy to think in terms of systems, interactions, and time. However, doing so is a smart way to separate yourself from the masses.

https://fs.blog/2016/04/second-order-thinking/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Work Required to Have an Opinion

While we all hold an opinion on almost everything, how many of us actually do the work required to have an opinion?

I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.

— Charlie Munger

The work is the hard part, that’s why people avoid it. You have to do the reading. You have to talk to competent people and understand their arguments. You have to think about the key variables and how they interact over time. You have to listen and chase down arguments that run counter to your views. You have to think about how you might be fooling yourself. You have to see the issue from multiple perspectives. You have to think. You need to become your most intelligent critic and have the intellectual honesty to kill some of your best-loved ideas.

We all are learning, modifying, or destroying ideas all the time. Rapid destruction of your ideas when the time is right is one of the most valuable qualities you can acquire. You must force yourself to consider arguments on the other side.

— Charlie Munger

As Rabbi Moses ben Maimon (1135–1204), commonly known as Maimonides, said: “Teach thy tongue to say I do not know, and thou shalt progress.”

Doing the work required to hold an opinion means you can argue against yourself better than others can. Only then can you say, “I can hold this view because I can’t find anyone else who can argue better against my view.”

Great thinkers, like Charles Darwin, did the work. And it’s one of the biggest reasons he’s buried at Westminster Abbey.

Doing the work counteracts our natural desire to seek out only information that confirms what we believe we know.

When Darwin encountered opinions or facts that ran contrary to his ideas, he endeavored not only to listen but also not to rest until he could either argue better than his challengers or understand how the fact fit. Darwin did the work. It wasn’t easy, but that’s the point.

The difference between the people who do the work and the people who just reel off memorized opinions is huge. When you do the work, you can answer the next question. You know when to follow the rules and when they’ll get you in trouble.

When I did my MBA, I was surrounded by people who could answer the test questions. They got good grades (actually, they got great grades), but an odd thing happened after school: a lot of those people couldn’t apply their knowledge to problems they hadn’t seen before.

They were chauffeurs — they knew the memorized answer. They couldn’t answer the next question. We’re all chauffeurs in some aspects of our lives. This is why understanding your circle of competence is so critical to living a rational life.

The ability to destroy your ideas rapidly instead of slowly when the occasion is right is one of the most valuable things. You have to work hard on it. Ask yourself what are the arguments on the other side. It’s bad to have an opinion you’re proud of if you can’t state the arguments for the other side better than your opponents. This is a great mental discipline.

— Charlie Munger

Doing the work means you can’t make up your mind with a high degree of confidence right away.

Doing the work forces you to challenge your beliefs because you have to argue from both sides. You become the somewhat impartial judge. What’s on trial is your opinion.

If you want to work with the world rather than against it, one of the major leverage points you can put effort into is how to distinguish between the people who’ve done the work and those who haven’t. The ones who have will pass the Batesian Mimicry Test.

https://fs.blog/2013/04/the-work-requir ... n-opinion/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Why We Need the Humanities

The word itself contains the answer


Image
Nagasaki on September 24, 1945, six weeks after the city was destroyed by an American atom bomb (Lynn P. Walker, Jr./Wikimedia Commons)

A little over five years ago, a pair of huge, exquisitely crafted L-shaped antennas in Louisiana and Washington State picked up the chirping echo of two black holes colliding in space a billion years ago—and a billion light years away. In that echo, astrophysicists found proof of Einstein’s theory of gravitational waves—at a cost of more than $1 billion. If you ask why we needed this information, what was the use of it, you might as well ask—as Ben Franklin once did—“what is the use of a newborn baby?” Like a newborn’s potential, the value of a scientific discovery is limitless. It cannot be calculated, and it needs no justification.

But the humanities do. Once upon a time, no one asked why we needed to study the humanities because their value was considered self-evident, just like the value of scientific discovery. Now these two values have sharply diverged. Given the staggering cost of a four-year college education, which now exceeds $300,000 at institutions like Dartmouth College (where I taught for nearly 40 years), how can we justify the study of subjects such as literature? The Summer 2021 newsletter of the Modern Language Association reports a troubling statistic about American colleges and universities: from 2009 to 2019, the percentage of bachelor’s degrees awarded in modern languages and literature has plunged by 29 percent. “Where Have All the Majors Gone?” asks the article. But here’s a more pragmatic question: what sort of dividends does the study of literature pay, out there in the real world?

Right now, the readiest answer to this question is that it stretches the mind by exposing it to many different perspectives and thus prepares the student for what is widely thought to be the most exciting job of our time: entrepreneurship. In William Faulkner’s As I Lay Dying, the story of how a rural Mississippi family comes to bury its matriarch is told from 15 points of view. To study such a novel is to be forced to reckon with perspectives that are not just different but radically contradictory, and thus to develop the kind of adaptability that it takes to succeed in business, where the budding entrepreneur must learn how to satisfy customers with various needs and where he or she must also be ever ready to adapt to changing needs and changing times.

But there’s a big problem with this way of justifying the study of literature. If all you want is entrepreneurial adaptability, you can probably gain it much more efficiently by going to business school. You don’t need a novel by Faulkner—or anyone else.

Nevertheless, you could argue that literature exemplifies writing at its best, and thus trains students how to communicate in something other than tweets and text messages. To study literature is not just to see the rules of grammar at work but to discover such things as the symmetry of parallel structure and the concentrated burst of metaphor: two prime instruments of organization. Henry Adams once wrote that “nothing in education is more astonishing than the amount of ignorance it accumulates in the form of inert facts.” Literature shows us how to animate facts, and still more how to make them cooperate, to work and dance together toward revelation.

Yet literature can be highly complex. Given its complexity, given all the ways in which poems, plays, and novels resist as well as provoke our desire to know what they mean, the study of literature once again invites the charge of inefficiency. If you just want to know how to make the written word get you a job, make you a sale, or charm a venture capitalist, you don’t need to study the gnomic verses of Emily Dickinson or the intricate ironies of Jonathan Swift. All you need is a good textbook on writing and plenty of practice.

Why then do we really need literature? Traditionally, it is said to teach us moral lessons, prompting us to seek “the moral of the story.” But moral lessons can be hard to extract from any work of literature that aims to tell the truth about human experience—as, for instance, Shakespeare does in King Lear. In one scene of that play, a foolish but kindly old man has his eyes gouged out. And at the end of the play, what happens to the loving, devoted, long-suffering Cordelia—the daughter whom Lear banishes in the first act? She dies, along with the old king himself. So even though all the villains in the play are finally punished by death, it is not easy to say why Cordelia too must die, or what the moral of her death might be.

Joseph Conrad once declared that his chief aim as a novelist was to make us see. Like Shakespeare, he aimed to make us recognize and reckon with one of the great contradictions of humanity: that only human beings can be inhumane. Only human beings take children from their parents and lock them in cages, as American Border Patrol agents did to Central American children two years ago; only human beings burn people alive, as ISIS has done in our own time; only human beings use young girls as suicide bombers, as Boko Haram did 44 times in one recent year alone.

As a refuge from such horrors, literature can offer us visions or at least glimpses of beauty, harmony, and love. They are part of what Seamus Heaney called “the redress of poetry”—compensation for the misery, cruelty, and brutality that human beings ceaselessly inflict on one another. But literature at its most powerful is never just a balloon ride to fantasy, a trip to the moon on gossamer wings. Rather than taking flight from our inhumanity, great literature confronts it even while somehow keeping alive its faith in our humanity. What is the moral of Toni Morrison’s novel Beloved, the story of a formerly enslaved Black woman who killed her own infant daughter to spare her from a life of slavery and sexual exploitation? In a world of merciless inhumanity, can infanticide become an expression of love?

This is the kind of question literature insists on asking. At the heart of the humanities lies humanity, which stubbornly insists on measuring everything in terms of its impact on human life. Seventy-six years ago, J. Robert Oppenheimer midwifed the birth of the most destructive weapon the world had ever seen—a weapon that made America invincible, ended World War II, and saved countless American lives. But the atomic bombs that America dropped on Hiroshima and Nagasaki incinerated more than 200,000 men, women, and children. That is why Oppenheimer said afterward: “In some sort of crude sense which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin; and this is a knowledge which they cannot lose.”

In saying this, Oppenheimer was not just radically unscientific. He was potentially treasonous, disloyal to a government bent on military supremacy above all else. Refusing to join the next heat in the arms race, the development of the hydrogen bomb, Oppenheimer lost his security clearance and spent the rest of his life under a cloud of suspicion.

But his response to the bombing of Hiroshima and Nagasaki demonstrates the kind of humanity that the humanities aim to nurture. We need this humanity now more than ever, when the diabolical cruelty of terrorism is compounded by the destructiveness of our very own drone strikes, which too often hit not only the guilty but also the innocent—the victims of “collateral damage,” the human life we sacrifice to our military ends.

We need literature to bear witness to such sacrifices—the lives we take and also the minds we deform in the process of making war. One of those minds is portrayed in a book called Redeployment, a collection of stories about American soldiers in Iraq written by Phil Klay, a veteran U.S. Marine officer. In one of his stories, a lance corporal says to a chaplain, “The only thing I want to do is kill Iraqis. That’s it. Everything else is just, numb it until you can do something. Killing hajjis is the only thing that feels like doing something. Not just wasting time.”

Where is the humanity here? This soldier has just enough left to realize that he has been weaponized, turned into a killing machine. Literature thus strives to speak both for and to whatever shred of humanity may survive the worst of our ordeals. In The Plague, a novel he wrote during the Second World War, Albert Camus symbolically portrays the war as a bubonic plague striking an Algerian city. The story is told by a doctor who struggles—often in vain—to save all the lives he can, though hundreds of men, women, and children will die before the plague has run its course. In the end, he says, this tale records what had to be done and what must be “done again in the never-ending fight against terror and its relentless onslaughts.”

If these words seem uncannily resonant for our time, consider what the doctor says about how the fight against terror must be waged. “Despite their personal afflictions,” he says, it must be waged “by all who, while unable to be saints but refusing to bow down to pestilences, strive their utmost to be healers.”

Having spent trillions of dollars fighting terrorism with bullets and bombs, we need literature and the humanities now more than ever, because they strive to heal, to nurture the most priceless of all our possessions: our humanity.

https://theamericanscholar.org/why-we-n ... urce=email
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Video Quote: Dangers of Over Specialization in Knowledge

Image

Video:

https://www.youtube.com/watch?v=_mw6xOXloHo
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

You Are Not Who You Think You Are

You may think you understand the difference between seeing something and imagining it. When you see something, it’s really there; when you imagine it, you make it up. That feels very different.

The problem is that when researchers ask people to imagine something, like a tomato, and then give some of them a just barely visible image of a tomato, they find that the process of imagining it is hard to totally separate from the process of seeing it. In fact, they use a lot of the same brain areas.

And when you stop to think about it, that makes some sense. Your brain is locked in the pitch-black bony vault of your skull, trying to use scraps of information to piece together the world. Even when it’s seeing, it’s partly constructing what’s out there based on experience. “It turns out, reality and imagination are completely intermixed in our brain,” Nadine Dijkstra writes in Nautilus, “which means that the separation between our inner world and the outside world is not as clear as we might like to think.”

We grew up believing that “imagining” and “seeing” describe different mental faculties. But as we learn more about what’s going on in the mind, these concepts get really blurry really fast.

This is happening all over the place. Over the centuries, humans have come up with all sorts of concepts to describe different thinking activities: memory, perception, emotion, attention, decision-making. But now, as scientists develop greater abilities to look at the brain doing its thing, they often find that the activity they observe does not fit the neat categories our culture has created, and which we rely on to understand ourselves.

Let me give you a few more examples:

Reason/Emotion. It feels as if the rational brain creates and works with ideas, but that emotions sweep over us. But some neuroscientists, like Lisa Feldman Barrett of Northeastern University, argue that people construct emotions and thoughts, and there is no clear distinction between them. It feels as if we can use our faculty of reason to restrain our passions, but some neuroscientists doubt this is really what’s happening. Furthermore, emotions assign value to things, so they are instrumental to reason, not separate from or opposed to it.

Observation/Memory. Observation feels like a transparent process. You open your eyes and take stuff in. In fact, much or most of seeing is making mental predictions about what you expect to see, based on experience, and then using sensory input to check and adjust your predictions. Thus, your memory profoundly influences what you see. “Perceptions come from the inside out just as much, if not more, than from the outside in,” the University of Sussex neuroscientist Anil Seth has observed. The conversation between senses and memory produces what he calls a “controlled hallucination,” which is the closest we can get to registering reality.

Understanding/Experiencing. Understanding seems cognitive; you study something and figure it out. Experience seems sensory; you physically live through some event. But Mark Johnson, now a professor emeritus in the University of Oregon’s Department of Philosophy, points out that there is no such thing as disembodied understanding. Your neural, chemical and bodily responses are in continual conversation with one another, so both understanding and experiencing are mental and physical simultaneously. “When faced with a whole person,” Joe Gough, a Ph.D. student in philosophy at the University of Sussex, writes, “we shouldn’t think that they can be divided into a ‘mind’ and a ‘body.’”

Self-control. We talk as if there’s a thing called self-control, or self-regulation, or grit. But the Stanford psychology professor Russell Poldrack tells me that when you give people games to measure self-control in a lab, the results do not predict whether they will be able to resist alcohol or drug use in the real world. This suggests, Poldrack says, that what we believe is “self-control” may really be a bunch of different processes.

Jordana Cepelewicz recently had an excellent essay on this broad conceptual challenge in Quanta Magazine. “You realize that neither the term ‘decision-making’ nor the term ‘attention’ actually corresponds to a thing in the brain,” the University of Montreal neuroscientist Paul Cisek told her. She also reported that some in the field believe that the concepts at the core of how we think about thinking need to be radically revised.

That seems exciting. I’ve long wondered if in 50 years terms like “emotion” or “reason” will be obsolete. Some future genius will have come up with an integrative paradigm that more accurately captures who we are and how we think.

I love how holistic the drift of research is. For a while, neuroscientists spent a lot of time trying to figure out what region of the brain did what function. (Fear is in the amygdala!) Today they also look at the ways vast networks across the brain, body and environment work together to create comprehensive mental states. Now there is much more emphasis on how people and groups creatively construct their own realities, and live within their own constructions.

I’ve often told young people to study genetics. That will clearly be important. But I’m realizing we all need to study this stuff, too. Big, exciting changes are afoot.

https://www.nytimes.com/2021/09/02/opin ... pe=Article

********
The Awesome Importance of Imagination

Plato and Aristotle disagreed about the imagination. As the philosopher Stephen Asma and the actor Paul Giamatti pointed out in an essay in March, Plato gave the impression that imagination is a somewhat airy-fairy luxury good. It deals with illusions and make-believe and distracts us from reality and our capacity to coolly reason about it. Aristotle countered that imagination is one of the foundations of all knowledge.

One tragedy of our day is that our culture hasn’t fully realized how right Aristotle was. Our society isn’t good at cultivating the faculty we may need most.

What is imagination? Well, one way of looking at it is that every waking second your brain is bombarded with a blooming, buzzing confusion of colors, shapes and movements. Imagination is the capacity to make associations among all these bits of information and to synthesize them into patterns and concepts. When you walk, say, into a coffee shop you don’t see an array of surfaces, lights and angles. Your imagination instantly coalesces all that into an image: “coffee shop.”

Neuroscientists have come to appreciate how fantastically complicated and subjective this process of creating mental images really is. You may think perception is a simple “objective” process of taking in the world and cognition is a complicated process of thinking about it. But that’s wrong.

Perception — the fast process of selecting, putting together, interpreting and experiencing facts, thoughts and emotions — is the essential poetic act that makes you you.

For example, you don’t see the naked concept “coffee shop.” The image you create is coated with personal feelings, memories and evaluations. You see: “slightly upscale suburban coffee shop trying and failing to give off a hipster vibe.” The imagination, Charles Darwin wrote, “unites former images and ideas, independently of the will, and thus creates brilliant and novel results.”

Furthermore, imagination can get richer over time. When you go to Thanksgiving dinner, your image of Uncle Frank contains the memories of past Thanksgivings, the arguments and the jokes, and the whole sum of your common experiences. The guy you once saw as an insufferable blowhard you now see — as your range of associations has widened and deepened — as a decent soul struggling with his wounds. “A fool sees not the same tree that a wise man sees,” William Blake observed.

Can you improve your imagination? Yes. By creating complex and varied lenses through which to see the world. The novelist Zadie Smith once wrote that when she was a girl she was constantly imagining what it would be like to grow up in the homes of her friends.

“I rarely entered a friend’s home without wondering what it might be like to never leave,” she wrote in The New York Review of Books. “That is, what it would be like to be Polish or Ghanaian or Irish or Bengali, to be richer or poorer, to say these prayers or hold those politics. I was an equal-opportunity voyeur. I wanted to know what it was like to be everybody. Above all, I wondered what it would be like to believe the sorts of things I didn’t believe.”

What an awesome way to prepare the imagination for the kind of society we all now live in.

Zora Neale Hurston grew up by a main road in Eatonville, Fla. As a young girl she’d walk up to carriages passing by and call out, “Don’t you want me to go a piece of the way with you?” She’d get invited into the carriage, have a conversation with strangers for a while and then walk back home.

These kinds of daring social adventures were balanced, in Hurston’s case, and in the case of many people with cultivated imaginations, with long periods of reading and solitude and inner adventures in storytelling. “I lived an exciting life unseen,” Hurston later recalled.

A person who feeds his or her imagination with a fuller repertoire of thoughts and experiences has the ability not only to see reality more richly but also — even more rare — to imagine the world through the imaginations of others. This is the skill we see in Shakespeare to such a miraculous degree — his ability to disappear into his characters and inhabit their points of view without ever pretending to explain them.

Different people have different kinds of imagination. Some people mainly focus on the parts of the world that can be quantified. This prosaic form of pattern recognition can be very practical. But it often doesn’t see the subjective way people coat the world with values and emotions and aspirations, which is exactly what we want to see if we want to glimpse how they experience their experience.

Blake and others aspired to the most enchanted form of imagination, which, as Mark Vernon writes in Aeon, “bridges the subjective and objective, and perceives the interior vitality of the world as well as its interconnecting exteriors.” This is van Gogh painting starry nights and Einstein imagining himself riding alongside a light beam.

Imagination helps you perceive reality, try on other realities, predict possible futures, experience other viewpoints. And yet how much do schools prioritize the cultivation of this essential ability?

What happens to a society that lets so much of its imaginative capacity lie fallow? Perhaps you wind up in a society in which people are strangers to one another and themselves.

https://www.nytimes.com/2021/11/11/opin ... 778d3e6de3