Amira's favorite quotes

"Everything you can imagine is real." — Pablo Picasso

Lapidarium notes


Mar 27th, Wed
"Daniel C. Dennett favours the theory (first suggested by Richard Dawkins) that our social learning has given us a second information highway (in addition to the genetic highway) where the transmission of variant cultural information (memes) takes place via differential replication. Software viruses, for example, can be understood as memes, and as memes evolve in complexity, so does human cognition: “The mind is the effect, not the cause.” (…)

Daniel Dennett: “Natural selection is not gene centrist and nor is biology all about genes, our comprehending minds are a result of our fast evolving culture. Words are memes that can be spoken and words are the best example of memes. Words have a genealogy and it’s easier to trace the evolution of a single word than the evolution of a language.”
Daniel C. Dennett is University Professor, Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University, Daniel Dennett: ‘I don’t like theory of mind’ – interview, The Guardian, 22 March 2013.
See also:  Daniel C. Dennett on an attempt to understand the mind; autonomic neurons, culture and computational architecture, Lapidarium notes
Feb 15th, Fri

"We’re beginning to come to grips with the idea that your brain is not this well-organized hierarchical control system where everything is in order, a very dramatic vision of bureaucracy. In fact, it’s much more like anarchy with some elements of democracy. Sometimes you can achieve stability and mutual aid and a sort of calm united front, and then everything is hunky-dory, but then it’s always possible for things to get out of whack and for one alliance or another to gain control, and then you get obsessions and delusions and so forth.

You begin to think about the normal well-tempered mind, in effect, the well-organized mind, as an achievement, not as the base state. (…) You’re going to have a parallel architecture because, after all, the brain is obviously massively parallel.

It’s going to be a connectionist network. (…) [Y]ou begin to realize that control in brains is very different from control in computers. (…) Each neuron is imprisoned in your brain. I now think of these as cells within cells, as cells within prison cells. Realize that every neuron in your brain, every human cell in your body (leaving aside all the symbionts), is a direct descendent of eukaryotic cells that lived and fended for themselves for about a billion years as free-swimming, free-living little agents. They fended for themselves, and they survived.

They had to develop an awful lot of know-how, a lot of talent, a lot of self-protective talent to do that. When they joined forces into multi-cellular creatures, they gave up a lot of that. They became, in effect, domesticated. They became part of larger, more monolithic organizations. (…)

Maybe a lot of the neurons in our brains are not just capable but, if you like, motivated to be more adventurous, more exploratory or risky in the way they comport themselves, in the way they live their lives. They’re struggling amongst themselves with each other for influence, just for staying alive, and there’s competition going on between individual neurons. As soon as that happens, you have room for cooperation to create alliances, and I suspect that a more free-wheeling, anarchic organization is the secret of our greater capacities of creativity, imagination, thinking outside the box and all that, and the price we pay for it is our susceptibility to obsessions, mental illnesses, delusions and smaller problems.

We got risky brains that are much riskier than the brains of other mammals even, even more risky than the brains of chimpanzees, and that this could be partly a matter of a few simple mutations in control genes that release some of the innate competitive talent that is still there in the genomes of the individual neurons. But I don’t think that genetics is the level to explain this. You need culture to explain it.”

Daniel C. Dennett is University Professor, Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University, Daniel C. Dennett on an attempt to understand the mind; autonomic neurons, culture and computational architecture, Lapidarium notes, 2013.
Jan 21st, Mon
A database can be listed; a human mind has to be stimulated.
Dave Snowden is a Welsh academic, consultant, and researcher in the field of knowledge management, The Ashen Model, (pdf), p.4. (tnx johntropea)
Dec 27th, Thu
David Deutsch on Artificial Intelligence

“What is needed is nothing less than a breakthrough in philosophy, a theory that explains how brains create explanations. (…)

What distinguishes human brains from all other physical systems is qualitatively different from all other functionalities, and cannot be specified in the way that all other attributes of computer programs can be. It cannot be programmed by any of the techniques that suffice for writing any other type of program. Nor can it be achieved merely by improving their performance at tasks that they currently do perform, no matter by how much. Why? I call the core functionality in question creativity: the ability to produce new explanations. (…)

What is needed is nothing less than a breakthrough in philosophy, a new epistemological theory that explains how brains create explanatory knowledge and hence defines, in principle, without ever running them as programs, which algorithms possess that functionality and which do not. (…)

The truth is that knowledge consists of conjectured explanations — guesses about what really is (or really should be, or might be) out there in all those worlds. Even in the hard sciences, these guesses have no foundations and don’t need justification. Why? Because genuine knowledge, though by definition it does contain truth, almost always contains error as well. So it is not ‘true’ in the sense studied in mathematics and logic. Thinking consists of criticising and correcting partially true guesses with the intention of locating and eliminating the errors and misconceptions in them, not generating or justifying extrapolations from sense data. And therefore, attempts to work towards creating an AGI that would do the latter are just as doomed as an attempt to bring life to Mars by praying for a Creation event to happen there. (…)

Present-day software developers could straightforwardly program a computer to have ‘self-awareness’ if they wanted to. But it is a fairly useless ability.”
David Deutsch, British physicist at the University of Oxford, Creative blocks, aeon, Oct 3, 2012.
See also: ☞ David Deutsch: A new way to explain explanation, Lapidarium notes
Dec 7th, Fri
Philosophy, art, and science are not the mental objects of an objectified brain but the three aspects under which the brain becomes subject.
Gilles Deleuze, French philosopher (1925-1995), What Is Philosophy?, Verso, 1994, p. 210.

“Through this demonstration we learn that neither light, nor eye, nor brain, alone or in association, can see. But rather, we see only through the total coordination of human experiences; and even then, it is our own conceived image, and not really the actual object which we perceive. We learn, therefore, that we see by creative ability and not by mechanical reproduction.”
Frederick Kiesler, Austrian-American sculptor, theater designer, artist, theoretician and architect (1890-1965), Vision Machine, “…seeing the act of looking” (tnx the-rx)
Jul 16th, Mon
Capacity of the human mind:

The optimistic position on the capacity of the human mind vis-à-vis the cosmos was nicely summed up by Emily Dickinson. “The Brain—is wider than the Sky—,” she wrote sometime around 1862, a generation before Einstein was born:

For—put them side by side—
The one the other will contain
With ease—and You—beside
Kathryn Schulz, American journalist and author, Book Review: Schulz on Jim Holt’s Why Does the World Exist?, Vulture, July 8, 2012. (tnx johnsparker)
Jun 20th, Wed
The crucial point is that everything that we see in the right half of our vision is processed in the left hemisphere of our brain, and everything we see in the left half is processed by the right hemisphere. And for most of us, the left brain is stronger at processing language. So perhaps the language savvy half of our brain is helping us out. (…) Among those who were the fastest at identifying the odd color, English speakers showed no left brain / right brain distinction, whereas Korean speakers did. It’s plausible that their left brain was attuned to the distinction between yeondu and chorok. (…)

Language is somehow enhancing your left brain’s ability to discern different colors with different names. Cultural forces alter our perception in ever so subtle a way, by gently tugging our visual leanings in different directions. (…) As infant brains are rewiring themselves to absorb our visual language, the seat of categorical processing jumps hemispheres from the right brain to the left. And it stays here throughout adulthood. Their brains are furiously re-categorizing the world, until mysteriously, something finally clicks into place.
Jun 3rd, Sun
I’ve long suspected, based on observations of myself as well as observations of society, that, beyond the psychological and cognitive strains produced by what we call information overload, there is a point in intellectual inquiry when adding more information decreases understanding rather than increasing it.

[Nassim Nicholas] Taleb’s observation that as the frequency of information sampling increases, the amount of noise we take in expands more quickly than the amount of signal might help to explain the phenomenon, particularly if human understanding hinges as much or more on the noise-to-signal ratio of the information we take in as on the absolute amount of signal we’re exposed to. Because we humans seem to be natural-born signal hunters, we’re terrible at regulating our intake of information. We’ll consume a ton of noise if we sense we may discover an added ounce of signal. So our instinct is at war with our capacity for making sense.
Nicholas Carr, American writer who has published books and articles on technology, business, and culture, A little more signal, a lot more noise, Rough Type, May 30, 2012.
May 23rd, Wed

Bruce Hood on The Self Illusion: How the Brain Creates Identity


I think that both the “I” and the “me” are actually ever-changing narratives generated by our brain to provide a coherent framework to organize the output of all the factors that contribute to our thoughts and behaviors.

I think it helps to compare the experience of self to subjective contours – illusions such as the Kanizsa pattern where you see an invisible shape that is really defined entirely by the surrounding context. People understand that it is a trick of the mind but what they may not appreciate is that the brain is actually generating the neural activation as if the illusory shape was really there. In other words, the brain is hallucinating the experience. There are now many studies revealing that illusions generate brain activity as if they existed. They are not real but the brain treats them as if they were. (…)

Me is similarly constructed, though we may be more aware of the events that have shaped it over our lifetime. But neither is cast in stone and both are open to all manner of reinterpretation. As artists, illusionists, movie makers, and more recently experimental psychologists have repeatedly shown, conscious experience is highly manipulatable and context dependent. Our memories are also largely abstracted reinterpretations of events – we all hold distorted memories of past experiences. (…)

By rejecting the notion of a core self and considering how we are a multitude of competing urges and impulses, I think it is easier to understand why we suddenly go off the rails. It explains why we act, often unconsciously, in a way that is inconsistent with our self image – or the image of our self as we believe others see us.

That said, the self illusion is probably an inescapable experience we need for interacting with others and the world, and indeed we cannot readily abandon or ignore its influence, but we should be skeptical that each of us is the coherent, integrated entity we assume we are. (…)

There’s nothing at the center. We’re the product of the emergent property, I would argue, of the multitude of these processes that generate us.”
Bruce Hood, Canadian-born experimental psychologist who specialises in developmental cognitive neuroscience, Director of the Bristol Cognitive Development Centre, based at the University of Bristol, cited in ☞ The Self Illusion: How the Brain Creates Identity, Lapidarium notes, May 2012. (Illustration source)
May 21st, Mon
Paul King on what is the best explanation for identity

From the perspective of neuroscience, personal identity is what happens when the brain forms a model of the environment that includes a first-person perspective and narrative history.

Eric Kandel, lead editor of the textbook Principles of Neural Science, and winner of a Nobel Prize for work on the neural basis of memory, calls memory the “neural basis of individuation.” And it is. For without memory, we could not each carry around a unique sense of self, formed from a differentiated life history.

If everyone on the planet woke up one day with amnesia, human beings would be a herd of mostly undifferentiated people. Without the ability to distinguish one person from another, or remember unique histories or events, everyone becomes a vague blur of humanity.

In addition to our sense of unique personal history, the brain also maintains a model of other people. “Theory of mind” in cognitive science refers to the brain’s ability to model and track the goals, beliefs, and behavior patterns of other human beings around us in a social context. With a little introspection, this model of others can extend to ourselves. As one comedian quipped: “How can I know what I think until I hear what I say?”

Because everyone in society carries around a model of themselves and the others they know, all the brains in human society collectively comprise a substrate for the distributed representation of human identity. Our identity is shaped not only by our own beliefs about ourselves, but by what others think of us as well. Social roles are collectively determined, and personality is shaped by how others treat us as well as our predisposition to a certain character and temperament.

And lastly, while personal identity feels unique, unified, and permanent, it is not. Identical twins are often confused. In institutions, people are identified by role (e.g. sales representative for the western region) while the actual person may change. And someone’s personality can change with mood. In children, we see personal identity form, and in senior dementia, we see it unravel.”
Paul King, visiting scholar at the Redwood Center for Theoretical Neuroscience at University of California, Berkeley, working on computational models of vision, What is the best explanation for identity (in a philosophical, neuroscientific, or psychological sense)?, Quora, Jan 18, 2012. (tnx wildcat2030)
Apr 29th, Sun
Culture does leave its signature in the circuitry of the individual brain. If you were to examine an acorn by itself, it could tell you a great deal about its surroundings – from moisture to microbes to the sunlight conditions of the larger forest. By analogy, an individual brain reflects its culture. Our opinions on normality, custom, dress codes and local superstitions are absorbed into our neural circuitry from the social forest around us. To a surprising extent, one can glimpse a culture by studying a brain. Moral attitudes toward cows, pigs, crosses and burkas can be read from the physiological responses of brains in different cultures.
David Eagleman, neuroscientist at Baylor College of Medicine, where he directs the Laboratory for Perception and Action, bestselling author, ☞ David Eagleman on how we construct reality, time perception, and The Secret Lives of the Brain, Lapidarium notes, The Observer, 29 April 2012.
Apr 14th, Sat
“Probably 99.999 percent of what goes on in the brain is automatic and unconscious. I have no idea what my next sentence will be, and sometimes I sound like it. (…) We think the other stuff, the ‘me,’ the ‘self,’ — we think that’s really important. We think there is somebody in charge — somebody pulling the levers. (…)

The brain is automatic but people are free. You are responsible. Get over it.”

Free will is not a useful concept at the level of brain biology, to summarize Gazzaniga, because the biology is fixed. We cannot control our brains. It is at the level of interactions between people where concepts like responsibility and justice can be addressed. Gazzaniga compared the problem to an analysis of traffic, which cannot be achieved by studying individual cars. “Traffic only exists in the interaction,” he said.
Michael Gazzaniga, professor of psychology at the University of California, Santa Barbara, where he heads the new SAGE Center for the Study of the Mind, cited in Can we have free will, if the brain’s actions are automatic? A scholar makes the case, Capital New York, Apr 13, 2012.
Apr 10th, Tue
At some time in the history of the universe, there were no human minds, and at some time later, there were. Within the blink of a cosmic eye, a universe in which all was chaos and void came to include hunches, beliefs, sentiments, raw sensations, pains, emotions, wishes, ideas, images, inferences, the feel of rubber, Schadenfreude, and the taste of banana ice cream.
David Berlinski, American author, a Senior Fellow of the Discovery Institute’s Center for Science and Culture, On the Origins of the Mind (pdf), Discovery Institute, 2004
Mar 21st, Wed
The miracle of your mind isn’t that you can see the world as it is, but that you can see the world as it isn’t. We can remember the past and we can think about the future, and we can imagine what it’s like to be some other person in some other place. And we all do this differently.
Kathryn Schulz, American journalist and author, Kathryn Schulz: On being wrong, TED talk [12:00-12:17], Mar 2011.