Home/ TOK Friends/ Group items matching "discussion" in title, tags, annotations or url

Javier E

What Happened Before the Big Bang? The New Philosophy of Cosmology - Ross Andersen - Technology - The Atlantic - 1 views

  • This question of accounting for what we call the "big bang state" -- the search for a physical explanation of it -- is probably the most important question within the philosophy of cosmology, and there are a couple different lines of thought about it.
  • One that's becoming more and more prevalent in the physics community is the idea that the big bang state itself arose out of some previous condition, and that therefore there might be an explanation of it in terms of the previously existing dynamics by which it came about
  • The problem is that quantum mechanics was developed as a mathematical tool. Physicists understood how to use it as a tool for making predictions, but without an agreement or understanding about what it was telling us about the physical world. And that's very clear when you look at any of the foundational discussions. This is what Einstein was upset about; this is what Schrödinger was upset about. Quantum mechanics was merely a calculational technique that was not well understood as a physical theory. Bohr and Heisenberg tried to argue that asking for a clear physical theory was something you shouldn't do anymore. That it was something outmoded. And they were wrong, Bohr and Heisenberg were wrong about that. But the effect of it was to shut down perfectly legitimate physics questions within the physics community for about half a century. And now we're coming out of that.
  • ...9 more annotations...
  • One common strategy for thinking about this is to suggest that what we used to call the whole universe is just a small part of everything there is, and that we live in a kind of bubble universe, a small region of something much larger
  • Newton realized there had to be some force holding the moon in its orbit around the earth, to keep it from wandering off, and he knew also there was a force that was pulling the apple down to the earth. And so what suddenly struck him was that those could be one and the same thing, the same force
  • That was a physical discovery, a physical discovery of momentous importance, as important as anything you could ever imagine because it knit together the terrestrial realm and the celestial realm into one common physical picture. It was also a philosophical discovery in the sense that philosophy is interested in the fundamental natures of things.
  • There are other ideas, for instance that maybe there might be special sorts of laws, or special sorts of explanatory principles, that would apply uniquely to the initial state of the universe.
  • The basic philosophical question, going back to Plato, is "What is x?" What is virtue? What is justice? What is matter? What is time? You can ask that about dark energy - what is it? And it's a perfectly good question.
  • right now there are just way too many freely adjustable parameters in physics. Everybody agrees about that. There seem to be many things we call constants of nature that you could imagine setting at different values, and most physicists think there shouldn't be that many, that many of them are related to one another. Physicists think that at the end of the day there should be one complete equation to describe all physics, because any two physical systems interact and physics has to tell them what to do. And physicists generally like to have only a few constants, or parameters of nature. This is what Einstein meant when he famously said he wanted to understand what kind of choices God had --using his metaphor-- how free his choices were in creating the universe, which is just asking how many freely adjustable parameters there are. Physicists tend to prefer theories that reduce that number
  • You have others saying that time is just an illusion, that there isn't really a direction of time, and so forth. I myself think that all of the reasons that lead people to say things like that have very little merit, and that people have just been misled, largely by mistaking the mathematics they use to describe reality for reality itself. If you think that mathematical objects are not in time, and mathematical objects don't change -- which is perfectly true -- and then you're always using mathematical objects to describe the world, you could easily fall into the idea that the world itself doesn't change, because your representations of it don't.
  • physicists for almost a hundred years have been dissuaded from trying to think about fundamental questions. I think most physicists would quite rightly say "I don't have the tools to answer a question like 'what is time?' - I have the tools to solve a differential equation." The asking of fundamental physical questions is just not part of the training of a physicist anymore.
  • The question remains as to how often, after life evolves, you'll have intelligent life capable of making technology. What people haven't seemed to notice is that on earth, of all the billions of species that have evolved, only one has developed intelligence to the level of producing technology. Which means that kind of intelligence is really not very useful. It's not actually, in the general case, of much evolutionary value. We tend to think, because we love to think of ourselves, human beings, as the top of the evolutionary ladder, that the intelligence we have, that makes us human beings, is the thing that all of evolution is striving toward. But what we know is that that's not true. Obviously it doesn't matter that much if you're a beetle, that you be really smart. If it were, evolution would have produced much more intelligent beetles. We have no empirical data to suggest that there's a high probability that evolution on another planet would lead to technological intelligence.
maxwellokolo

Witnessing Fear in Others Can Physically Change the Brain - 0 views

  • Neuroscience News has recent neuroscience research articles, brain research news, neurology studies and neuroscience resources for neuroscientists, students, and science fans and is always free to join. Our neuroscience social network has science groups, discussion forums, free books, resources, science videos and more.
bennetttony

I escaped Nazi Germany. I see its ideology alive in America today - 1 views

  • This article discusses many of the similarities between America today and Nazi Germany. I thought this was interesting to read because it comes from the point of view of someone who was actually there.
bennetttony

Nearly 50% of Trump fans believe Clinton's involved in pedophilia - 0 views

  • Nearly half of all Donald Trump voters believe a widely debunked conspiracy theory claiming that Hillary Clinton is involved in a child sex ring run out of a popular Washington, D.C. pizzeria, a recent poll suggests. I thought that this article was interesting because it shows how misinformed people are, possibly due to fake news, just like we discussed.
charlottedonoho

Beware Eurosceptic versions of history and science | Rebekah Higgitt | Science | The Guardian - 1 views

  • Readers of the Guardian Science pages may not have noticed the group called Historians for Britain, or a recent piece in History Today by David Abulafia asserting their belief “that Britain’s unique history sets it apart from the rest of Europe”.
  • It requires critical scrutiny from everyone with an interest in Britain’s relationship with the rest of the world, and in evidence-based political discussion.
  • Abulafia’s article is a classic example of an old-fashioned “Whiggish” narrative. It claims a uniquely moderate and progressive advance toward the development of British institutions, traced continuously from Magna Carta and isolated from the rages and radicalism of the Continent.
  • ...3 more annotations...
  • The answer is not “because Britain is better and unique” but “because I am British and these are the stories I have been brought up on” at school, university, on TV and elsewhere. Go to another country and you will see that they have their own, equally admirable, pantheon of greats.
  • The area that I have been working on, the eighteenth-century search for longitude, likewise reveals the need to challenge nationalistic assumptions.
  • Historians and readers of history both need to be aware of the biases of our education and literature. Accounts of British exceptionalism, especially those that lump the rest of Europe or the world into an amorphous group of also-rans, are more the result of national tradition and wishful thinking than a careful reading of the sources.
Javier E

A Harvard Scholar on the Enduring Lessons of Chinese Philosophy - The New York Times - 0 views

  • Since 2006, Michael Puett has taught an undergraduate survey course at Harvard University on Chinese philosophy, examining how classic Chinese texts are relevant today. The course is now one of Harvard’s most popular, third only to “Introduction to Computer Science” and “Principles of Economics.”
  • So-called Confucianism, for example, is read as simply being about forcing people to accept their social roles, while so-called Taoism is about harmonizing with the larger natural world. So Confucianism is often presented as bad and Taoism as good. But in neither case are we really learning from them.
  • we shouldn’t domesticate them to our own way of thinking. When we read them as self-help, we are assuming our own definition of the self and then simply picking up pieces of these ideas that fit into such a vision
  • ...11 more annotations...
  • these ideas are not about looking within and finding oneself. They are about overcoming the self. They are, in a sense, anti-self-help.
  • Today, we are often told that our goal should be to look within and find ourselves, and, once we do, to strive to be sincere and authentic to that true self, always loving ourselves and embracing ourselves for who we are. All of this sounds great and is a key part of what we think of as a properly “modern” way to live.
  • But what if we’re, on the contrary, messy selves that tend to fall into ruts and patterns of behavior? If so, the last thing we would want to be doing is embracing ourselves for who we are — embracing, in other words, a set of patterns we’ve fallen into. The goal should rather be to break these patterns and ruts, to train ourselves to interact better with those around us.
  • Certainly some strains of Chinese political theory will take this vision of the self — that we tend to fall into patterns of behavior — to argue for a more paternalistic state that will, to use a more recent term, “nudge” us into better patterns.
  • many of the texts we discuss in the book go the other way, and argue that the goal should be to break us from being such passive creatures — calling on us to do things that break us out of these patterns and allow us to train ourselves to start altering our behavior for the better.
  • You argue that Chinese philosophy views rituals as tools that can liberate us from these ruts.
  • Rituals force us for a brief moment to become a different person and to interact with those around us in a different way. They work because they break us from the patterns that we fall into and that otherwise dominate our behavior.
  • In the early Han dynasty, for example, we have examples of rituals that called for role reversals. The father would be called upon to play the son, and the son would play the father. Each is forced to see the world from the other’s perspective, with the son learning what it’s like to be in a position of authority and the father remembering what it was like to be the more subservient one
  • We tend to think that we live in a globalized world, but in a lot of ways we really don’t. The truth is that for a long time only a very limited number of ideas have dominated the world, while ideas that arose elsewhere were seen as “traditional” and not worth learning from.
  • imagine future generations that grow up reading Du Fu along with Shakespeare, and Confucius along with Plato. Imagine that type of world, where great ideas — wherever they arose — are thought about and wrestled with.
  • There’s a very strong debate going on in China about values — a sense that everything has become about wealth and power, and a questioning about whether this should be rethought. And among the ideas that are being brought into the debate are these earlier notions about the self and about how one can lead a good life. So, while the government is appropriating some of these ideas in particular ways, the broader public is debating them, and certainly with very different interpretations.
Javier E

Big Data Is Great, but Don't Forget Intuition - NYTimes.com - 2 views

  • THE problem is that a math model, like a metaphor, is a simplification. This type of modeling came out of the sciences, where the behavior of particles in a fluid, for example, is predictable according to the laws of physics.
  • In so many Big Data applications, a math model attaches a crisp number to human behavior, interests and preferences. The peril of that approach, as in finance, was the subject of a recent book by Emanuel Derman, a former quant at Goldman Sachs and now a professor at Columbia University. Its title is “Models. Behaving. Badly.”
  • A report last year by the McKinsey Global Institute, the research arm of the consulting firm, projected that the United States needed 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers, whether retrained or hired.
  • ...4 more annotations...
  • A major part of managing Big Data projects, he says, is asking the right questions: How do you define the problem? What data do you need? Where does it come from? What are the assumptions behind the model that the data is fed into? How is the model different from reality?
  • Society might be well served if the model makers pondered the ethical dimensions of their work as well as studying the math, according to Rachel Schutt, a senior statistician at Google Research. “Models do not just predict, but they can make things happen,” says Ms. Schutt, who taught a data science course this year at Columbia. “That’s not discussed generally in our field.”
  • the increasing use of software that microscopically tracks and monitors online behavior has raised privacy worries. Will Big Data usher in a digital surveillance state, mainly serving corporate interests?
  • my bigger concern is that the algorithms that are shaping my digital world are too simple-minded, rather than too smart. That was a theme of a book by Eli Pariser, titled “The Filter Bubble: What the Internet Is Hiding From You.”
Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture in the Digital Age | Harper's Magazine - 1 views

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • ...15 more annotations...
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • 6. You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
Javier E

Social Media and the Devolution of Friendship: Full Essay (Pts I & II) » Cyborgology - 1 views

  • social networking sites create pressure to put time and effort into tending weak ties, and how it can be impossible to keep up with them all. Personally, I also find it difficult to keep up with my strong ties. I’m a great “pick up where we left off” friend, as are most of the people closest to me (makes sense, right?). I’m decidedly sub-awesome, however, at being in constant contact with more than a few people at a time.
  • the devolution of friendship. As I explain over the course of this essay, I link the devolution of friendship to—but do not “blame” it on—the affordances of various social networking platforms, especially (but not exclusively) so-called “frictionless sharing” features.
  • I’m using the word here in the same way that people use it to talk about the devolution of health care. One example of devolution of health care is some outpatient surgeries: patients are allowed to go home after their operations, but they still require a good deal of post-operative care such as changing bandages, irrigating wounds, administering medications, etc. Whereas before these patients would stay in the hospital and nurses would perform the care-labor necessary for their recoveries, patients must now find their own caregivers (usually family members or friends; sometimes themselves) to perform free care-labor. In this context, devolution marks the shift of labor and responsibility away from the medical establishment and onto the patient; within the patient-medical establishment collaboration, the patient must now provide a greater portion of the necessary work. Similarly, in some ways, we now expect our friends to do a greater portion of the work of being friends with us.
  • ...13 more annotations...
  • Through social media, “sharing with friends” is rationalized to the point of relentless efficiency. The current apex of such rationalization is frictionless sharing: we no longer need to perform the labor of telling our individual friends about what we read online, or of copy-pasting links and emailing them to “the list,” or of clicking a button for one-step posting of links on our Facebook walls. With frictionless sharing, all we have to do is look, or listen; what we’ve read or watched or listened to is then “shared” or “scrobbled” to our Facebook, Twitter, Tumblr, or whatever other online profiles. Whether we share content actively or passively, however, we feel as though we’ve done our half of the friendship-labor by ‘pushing’ the information to our walls, streams, and tumblelogs. It’s then up to our friends to perform their halves of the friendship-labor by ‘pulling’ the information we share from those platforms.
  • We’re busy people; we like the idea of making one announcement on Facebook and being done with it, rather than having to repeat the same story over and over again to different friends individually. We also like not always having to think about which friends might like which stories or songs; we like the idea of sharing with all of our friends at once, and then letting them sort out amongst themselves who is and isn’t interested. Though social media can create burdensome expectations to keep up with strong ties, weak ties, and everyone in between, social media platforms can also be very efficient. Using the same moment of friendship-labor to tend multiple friendships at once kills more birds with fewer stones.
  • sometimes we like the devolution of friendship. When we have to ‘pull’ friendship-content instead of receiving it in a ‘push’, we can pick and choose which content items to pull. We can ignore the baby pictures, or the pet pictures, or the sushi pictures—whatever it is our friends post that we only pretend to care about
  • I’ve been thinking since, however, on what it means to view our friends as “generalized others.” I may now feel less like a “creepy stalker” when I click on a song in someone’s Spotify feed, but I don’t exactly feel ‘shared with’ either. Far as I know, I’ve never been SpotiVaguebooked (or SubSpotified?); I have no reason to think anyone is speaking to me personally as they listen to music, or as they choose not to disable scrobbling (if they make that choice consciously at all). I may have been granted the opportunity to view something, but it doesn’t follow that what I’m viewing has anything to do with me unless I choose to make it about me. Devolved friendship means it’s not up to us to interact with our friends personally; instead it’s now up to our friends to make our generalized broadcasts personal.
  • While I won’t go so far as to say they’re definitely ‘problems,’ there are two major things about devolved friendship that I think are worth noting. The first is the non-uniform rationalization of friendship-labor, and the second is the depersonalization of friendship-labor.
  • In short, “sharing” has become a lot easier and a lot more efficient, but “being shared with” has become much more time-consuming, demanding, and inefficient (especially if we don’t ignore most of our friends most of the time). Given this, expecting our friends to keep up with our social media content isn’t expecting them to meet us halfway; it’s asking them to take on the lion’s share of staying in touch with us. Our jobs (in this role) have gotten easier; our friends’ jobs have gotten harder.
  • The second thing worth noting is that devolved friendship is also depersonalized friendship.
  • Personal interaction doesn’t just happen on Spotify, and since I was hoping Spotify would be the New Porch, I initially found Spotify to be somewhat lonely-making. It’s the mutual awareness of presence that gives companionate silence its warmth, whether in person or across distance. The silence within Spotify’s many sounds, on the other hand, felt more like being on the outside looking in. This isn’t to say that Spotify can’t be social in a more personal way; once I started sending tracks to my friends, a few of them started sending tracks in return. But it took a lot more work to get to that point, which gets back to the devolution of friendship (as I explain below).
  • Within devolved friendship interactions, it takes less effort to be polite while secretly waiting for someone to please just stop talking.
  • When we consider the lopsided rationalization of ‘sharing’ and ‘shared with,’ as well as the depersonalization of frictionless sharing and generalized broadcasting, what becomes clear is this: the social media deck is stacked in such a way as to make being ‘a self’ easier and more rewarding than being ‘a friend.’
  • It’s easy to share, to broadcast, to put our selves and our tastes and our identity performances out into the world for others to consume; what feedback and friendship we get in return comes in response to comparatively little effort and investment from us. It takes a lot more work, however, to do the consumption, to sift through everything all (or even just some) of our friends produce, to do the work of connecting to our friends’ generalized broadcasts so that we can convert their depersonalized shares into meaningful friendship-labor.
  • We may be prosumers of social media, but the reward structures of social media sites encourage us to place greater emphasis on our roles as share-producers—even though many of us probably spend more time consuming shared content than producing it. There’s a reason for this, of course; the content we produce (for free) is what fuels every last ‘Web 2.0’ machine, and its attendant self-centered sociality is the linchpin of the peculiarly Silicon Valley concept of “Social” (something Nathan Jurgenson and I discuss together in greater detail here). It’s not super-rewarding to be one of ten people who “like” your friend’s shared link, but it can feel rewarding to get 10 “likes” on something you’ve shared—even if you have hundreds or thousands of ‘friends.’ Sharing is easy; dealing with all that shared content is hard.
  • But I wonder sometimes if the shifts in expectation that accompany devolved friendship don’t migrate across platforms and contexts in ways we don’t always see or acknowledge. Social media affects how we see the world — and how we feel about being seen in the world — even when we’re not engaged directly with social media websites. It’s not a stretch, then, to imagine that the affordances of social media platforms might also affect how we see friendship and our obligations as friends most generally.
grayton downing

The Stereotypes About Math That Hold Americans Back - Jo Boaler - The Atlantic - 2 views

  • Mathematics education in the United States is broken. Open any newspaper and stories of math failure shout from the pages: low international rankings, widespread innumeracy in the general population, declines in math majors. Here’s the most shocking statistic I have read in recent years: 60 percent of the 13 million two-year college students in the U.S. are currently placed into remedial math courses; 75 percent of them fail or drop the courses and leave college with no degree.
  • We need to change the way we teach math in the U.S., and it is for this reason that I support the move to Common Core mathematics.
  • One of the reasons for these results is that mathematical problems that need thought, connection making, and even creativity are more engaging for students of all levels and for students of different genders, races, and socio-economic groups. This is not only shown by my research but by decades of research in our field.
  • ...10 more annotations...
  • ways of working are critical in mathematical work and when they are taught and valued, many more students contribute, leading to higher achievement
  • mathematics education we suffer from the widespread, distinctly American idea that only some people can be “math people.” This idea has been disproved by scientific research showing the incredible potential of the brain to grow and adapt. But the idea that math is hard, uninteresting, and accessible only to “nerds” persists. 
  • harsh stereotypical thinking—mathematics is for select racial groups and men. This thinking, as well as the teaching practices that go with it, have provided the perfect conditions for the creation of a math underclass.
  • There is a good reason for this: Justification and reasoning are two of the acts that lie at the heart of mathematics. They are, in many ways, the essence of what mathematics is. Scientists work to prove or disprove new theories by finding many cases that work or counter-examples that do not. Mathematicians, by contrast, prove the validity of their propositions through justification and reasoning.
  • does not simply test a mathematical definition, as the first does. It requires that students visualize a triangle, use transformational geometry, consider whether different cases satisfy the mathematical definition, and then justify their thinking.
  • online platform explaining research evidence on ability and the brain and on good mathematics teaching, for teachers and parents. The course had a transformative effect. It was taken by 40,000 people, and 95 percent said they would change their teaching or parenting as a result.
  • The young people who are successful in today’s workforce are those who can discuss and reason about productive mathematical pathways, and who can be wrong, but can trace back to errors and work to correct them.
  • American idea that those who are good at math are those who are fast. Speed is revered in math classes across the U.S., and students as young as five years old are given timed tests—even though these have been shown to create math anxiety in young children. Parents use flash cards and other devices to promote speed, not knowing that they are probably damaging their children’s mathematical development
  • The fact of being quick or slow isn't really relevant
  • gives more time for depth and exploration than the curricula it has replaced by removing some of the redundant methods students will never need or use.
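The statistic quoted at the top of this excerpt implies some stark absolute numbers. A quick back-of-the-envelope calculation (using only the figures cited in the article — 13 million two-year college students, 60 percent placed into remedial math, 75 percent of those failing or dropping):

```python
# Rough arithmetic implied by the statistic quoted in the article above.
# The figures come from the article; the computation is purely illustrative.
two_year_students = 13_000_000
remedial_rate = 0.60        # share placed into remedial math courses
fail_or_drop_rate = 0.75    # share of remedial students who fail or drop

in_remedial = two_year_students * remedial_rate
fail_or_drop = in_remedial * fail_or_drop_rate

print(f"{in_remedial:,.0f} students placed into remedial math")      # 7,800,000
print(f"{fail_or_drop:,.0f} fail or drop and leave with no degree")  # 5,850,000
```

In other words, if the cited percentages hold, nearly six million students per cohort leave two-year colleges with no degree after failing remedial math alone.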
sissij

Is Empathy Overrated? | Big Think - 0 views

  • Empathy seems to be a quality you can never overdo. It’s like a megavitamin of emotionally relating: the more you display, the better a human you are.
  • In his last book, Just Babies, he argued humans are born moral, no religion required.
  • Telling someone empathy is overrated is akin to stating puppies are useless and ugly.
  • ...6 more annotations...
  • Empathy is the act of coming to experience the world as you think someone else does … If your suffering makes me suffer, if I feel what you feel, that’s empathy in the sense that I’m interested in here.
  • For example, donating to foreign charities ups our dopamine intake—we feel better because we’re making a difference (which, of course, can make it more about how we feel than who we’re helping).
  • Yet it’s not in our biological inheritance to offer unchecked empathy. Bloom points to our tribal nature as evidence. We’re going to care more for those closest to us, such as family and friends, than Cambodian orphans.
  • Anyone who thinks that it’s important for a therapist to feel depressed or anxious while dealing with depressed or anxious people is missing the point of therapy.
  • Bloom then discusses the difference between what Binghamton professor and Asian Studies scholar Charles Goodman describes as “sentimental compassion” and “great compassion.” The first is similar to empathy, which leads to imbalances in relationships and one’s own psychological state. Simply put, it’s exhausting.
  • Empathy is going to be a buzzword for some time to come. It feeds into our social nature, which Bloom sees nothing wrong with.
  •  
    I found this article very interesting, as it talks about how empathy as an emotion is sometimes bad for us. I really like the point where the author mentions that empathy is not in our biological inheritance, because our tribal nature is to care more for those closest to us. It is very interesting to think about how our modern society shapes our emotions and behavior, and how empathy is gradually becoming part of our nature. --Sissi (2/22/2017)
sissij

Believe It Or Not, Most Published Research Findings Are Probably False | Big Think - 0 views

  • but this has come with the side effect of a toxic combination of confirmation bias and Google, enabling us to easily find a study to support whatever it is that we already believe, without bothering to so much as look at research that might challenge our position
  • Indeed, this is a statement oft-used by fans of pseudoscience who take the claim at face value, without applying the principles behind it to their own evidence.
  • at present, most published findings are likely to be incorrect.
  • ...6 more annotations...
  • If you use p=0.05 to suggest that you have made a discovery, you will be wrong at least 30 percent of the time.
  • The problem is being tackled head on in the field of psychology which was shaken by the Stapel affair in which one Dutch researcher fabricated data in over 50 fraudulent papers before being detected.
  • a problem known as publication bias or the file drawer problem.
  • The smaller the effect size, the less likely the findings are to be true.
  • The greater the number and the lesser the selection of tested relationships, the less likely the findings are to be true.
  • For scientists, the discussion over how to resolve the problem is rapidly heating up with calls for big changes to how researchers register, conduct, and publish research and a growing chorus from hundreds of global scientific organizations demanding that all clinical trials are published.
  •  
    As we learned in TOK, science is full of uncertainties. In this article, the author suggests that even the publication of scientific papers is full of flaws. The general population often cites science sources that support their existing positions, yet scientific findings are full of faults, and the chances are high that a given claim is false. Sometimes it is not errors in experiments but the fabrication of data that leads to false scientific papers, and there are also recognizable patterns behind the publication of such papers.
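The "wrong at least 30 percent of the time" figure quoted above follows from base-rate reasoning: if only a small fraction of tested hypotheses are actually true, then many results that clear the p < 0.05 bar are false positives. A toy calculation — all parameter values here are illustrative assumptions, not taken from the article (10% of tested hypotheses true, 80% statistical power, α = 0.05) — makes the point:

```python
# Toy Bayesian arithmetic behind "p = 0.05 findings are often wrong".
# All parameter values are illustrative assumptions, not from the article.
prior_true = 0.10   # fraction of tested hypotheses that are actually true
power = 0.80        # P(significant result | hypothesis is true)
alpha = 0.05        # P(significant result | hypothesis is false)

true_positives = prior_true * power            # 0.08 of all tests
false_positives = (1 - prior_true) * alpha     # 0.045 of all tests
p_false_given_sig = false_positives / (true_positives + false_positives)

print(f"P(finding is false | p < 0.05) = {p_false_given_sig:.0%}")  # 36%
```

Under these (hypothetical) assumptions, more than a third of "significant" findings are false — and the fraction grows as the prior probability or the statistical power shrinks, which is exactly the pattern the corollaries above describe.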
oliviaodon

The Cult of Coincidence | The Huffington Post - 0 views

  • Most people readily believe that they themselves are essentially fully independent thinkers, and that closed-mindedness, intellectual inflexibility and an irrational commitment to pre-conceived thinking dwells only in the feeble minds of others. Think about it: When was the last time in the course of discussion that someone admitted to you something like, “You’re right, I have just blindly swallowed all of the positions and cultural mores of my milieu” or, “Yes, I agree that no amount of oppositional information will ever dissuade me from the beliefs I hold?” No one is immune from this state of affairs, and it requires courage and perpetual vigilance to even venture outside of the intellectual echo chamber that most of us inhabit.
  • There are those who believe that the scientific community is uniquely positioned to avoid these pitfalls. They suggest that the system of peer review is inherently self-critical, and as such is structurally quarantined from bias. Some scientists think otherwise and note that science, in as much as it is conducted by human beings, is subject to the same partiality as every other endeavor.
  • like the communist party under Lenin, science is [in its own eyes] infallible because its judgments are collective. Critics are unneeded, and since they are unneeded, they are not welcome.
  • ...2 more annotations...
  • A classic example of this endemic bias at work is illustrated through Einstein. He was disturbed by the implications of an expanding universe. For thousands of years it was assumed — outside of some theological circles — that matter was eternal. The notion that it came into being at a discrete point in time naturally implied that something had caused it, and quite possibly that that something had done it on purpose. Not willing to accept this new information, Einstein added a now famous “fudge factor” to his equations to maintain the static universe that he was comfortable with — something he would later describe as “the greatest blunder of my career.”
  • If there is great resistance to notions of design and causality in science, it is exponentially greater when it comes to theology.
maxwellokolo

Stressed by Success, a Top Restaurant Turns to Therapy - 1 views

  •  
    Ms. Puig, 63, works with groups of employees, from the back of the house to the front, on every aspect of their work except the food. On a recent Tuesday morning, she led a dozen workers in a discussion about the planned remodeling of their changing rooms. Ms. Puig asked them what they wanted from the rooms. Did they need privacy when looking into the mirror?
johnsonle1

Scientists Find First Observed Evidence That Our Universe May Be a Hologram | Big Think - 1 views

  • all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
  • the team found that the observational data they found was largely predictable by the math of holographic theory. 
  • After this phase comes to a close, the Universe goes into a geometric phase, which can be described by Einstein's equations.
  • ...1 more annotation...
  • It's a new paradigm for a physical reality.
  •  
    As we watched in the video "Spooky Science" in TOK, we saw how the 2D and 3D worlds are very distinct, but in this article the author discusses another theory: that our 3D reality may actually be encoded in the 2D surface of its boundaries. This theory is a rival to the theory of cosmic inflation. The holographic theory not only explains the abnormalities, it is also a simpler theory of the early universe. Now that scientists find the math of holographic theory can largely predict the observational data, it has the potential to become a new paradigm for physical reality. --Sissi (2/6/2017)
  •  
    What is the holographic universe idea? It's not exactly that we are living in some kind of Star Trekky computer simulation. Rather the idea, first proposed in the 1990s by Leonard Susskind and Gerard 't Hooft, says that all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
Javier E

The Duck of Minerva: Two certification systems - 0 views

  • Cohen outlined a vision of 'Net-enabled scholarly publishing that I can only think to call the aggregation model: editorial committees scanning the 'Net to find the most interesting scholarly content in a given field or discipline, and highlighting it through websites and e-mail blasts that hearken back to the early days when weblogs were literally just collections of links with one- or two-sentence summaries attached. (An example, edited by Cohen and some of his associates: Digital Humanities Now.) Some of that work consists of traditional books and articles, but much of it consists of blog posts, online debates, etc. This model gives us scholarly work from the bottom up, instead of generating published scholarly work by tossing a piece into the random crapshoot of putatively blind peer-review and crossing your fingers to see what happens. It also gives us scholarly work that can be certified as such by the collective deliberation of the community, which "votes" for pieces and ideas by reading them, recirculating them, linking to them, and other signs of interest and approval that can be easily tracked with traffic-tracing tools. And then, on top of that editorial aggregation -- Cohen made a great point that this kind of aggregation shouldn't be fully automated, because automated tools reward "loudmouths" and popular voices that just get retweeted a lot; human editors can do a lot to surface novel insights and new voices -- an open-access journal that curates the best of those linked items into published pieces, perhaps with some revisions and peer review/commentary.
Duncan H

What to Do About 'Coming Apart' - NYTimes.com - 0 views

  • Murray has produced a book-length argument placing responsibility for rising inequality and declining mobility on widespread decay in the moral fiber of white, lower-status, less well-educated Americans, putting relatively less emphasis on a similar social breakdown among low-status, less-educated Americans of all races
  • Murray’s strength lies in his ability to raise issues that center-left policy makers and academics prefer, for the most part, to shy away from. His research methods, his statistical analyses and the conclusions he draws are subject to passionate debate. But by forcing taboo issues into the public arena, Murray has opened up for discussion politically salient issues that lurk at a subterranean level in the back-brains of many voters, issues that are rarely examined with the rigor necessary to affirm or deny their legitimacy.
  • The National Review and the Conservative Monitor cited “Losing Ground” as one of the ten books that most changed America. Murray’s book seemed like a bolt of lightning in the middle of the night revealing what should have been plain as the light of day. The welfare state so carefully built up in the 1960s and 1970s created a system of disincentives for people to better their own lives. By paying welfare mothers to have children out of wedlock into a poor home, more of these births were encouraged. By doling out dollars at a rate that could not be matched by the economy, the system encouraged the poor to stay home.
  • ...9 more annotations...
  • He contends in “Coming Apart” that there was far greater social cohesion across class lines 50 years ago because “the powerful norms of social and economic behavior in 1960 swept virtually everyone into their embrace,” adding in a Jan. 21 op-ed in the Wall Street Journal that: Over the past 50 years, that common civic culture has unraveled. We have developed a new upper class with advanced educations, often obtained at elite schools, sharing tastes and preferences that set them apart from mainstream America. At the same time, we have developed a new lower class, characterized not by poverty but by withdrawal from America’s core cultural institutions. According to Murray, higher education has now become a proxy for higher IQ, as elite colleges become sorting mechanisms for finding, training and introducing to each other the most intellectually gifted young people. Fifty years into the education revolution, members of this elite are likely to be themselves the offspring of cognitively gifted parents, and to ultimately bear cognitively gifted children.
  • “Industriousness: The norms for work and women were revolutionized after 1960, but the norm for men putatively has remained the same: Healthy men are supposed to work. In practice, though, that norm has eroded everywhere.”
  • Murray makes the case that cognitive ability is worth ever more in modern advanced, technologically complex hypercompetitive market economies. As an example, Murray quotes Bill Gates: “Software is an IQ business. Microsoft must win the IQ war or we won’t have a future.”
  • Murray alleges that those with higher IQs now exhibit personal and social behavioral choices in areas like marriage, industriousness, honesty and religiosity that allow them to enjoy secure and privileged lives. Whites in the lower social-economic strata are less cognitively able – in Murray’s view – and thus less well-equipped to resist the lure of the sexual revolution and doctrines of self-actualization so they succumb to higher rates of family dissolution, non-marital births, worklessness and criminality. This interaction between IQ and behavioral choice, in Murray’s framework, is what has led to the widening income and cultural gap.
  • Despised by the left, Murray has arguably done liberals a service by requiring them to deal with those whose values may seem alien, to examine the unintended consequences of their policies and to grapple with the political impact of assertions made by the right. He has also amassed substantial evidence to bolster his claims and at the same time elicited a formidable academic counter-attack.
  • To Murray, the overarching problem is that liberal elites, while themselves living lives of probity, have refused to proselytize for the bourgeois virtues to which they subscribe, thus leaving their less discerning fellow-citizens to flounder in the anti-bourgeois legacy of the counter-cultural 1960s.
  • “Great Civic Awakening” among the new upper class – an awakening that will lead to the kind of “moral rearmament” and paternalism characteristic of anti-poverty drives in the 19th century. To achieve this, Murray believes, the “new upper class must once again fall in love with what makes America different.”
  • The cognitive elites Murray cites are deeply committed to liberal norms of cultural tolerance and permissiveness. The antipathy to the moralism of the religious right has, in fact, been a major force driving these upscale, secular voters into the Democratic party.
  • changes in the world economy may be destructive in terms of the old social model, but they are profoundly liberating and benign in and of themselves. The family farm wasn’t dying because capitalism had failed or a Malthusian crisis was driving the world to starvation. The family farm died of abundance; it died of the rapidly rising productivity that meant that fewer and fewer people had to work to produce the food on which humanity depended. Mead continues: Revolutions in manufacturing and, above all, in communications and information technology create the potential for unprecedented abundance and a further liberation of humanity from meaningless and repetitive work. Our problem isn’t that the sources of prosperity have dried up in a long drought; our problem is that we don’t know how to swim. It is raining soup, and we are stuck holding a fork. The 21st century, Mead adds, must reinvent the American Dream. It must recast our economic, social, familial, educational and political systems for new challenges and new opportunities. Some hallowed practices and institutions will have to go under the bus. But in the end, the changes will make us richer, more free and more secure than we are now. Mead’s predictions may or may not prove prescient, but it is his thinking, more than Murray’s, that reflects the underlying optimism that has sustained the United States for more than two centuries — a refusal to believe that anything about human nature is essentially “intractable.” Mead’s way of looking at things is not only more inviting than Murray’s, it is also more on target.
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • ...18 more annotations...
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atlantic - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of “resolute choice.” Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that a lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can come only when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Coursekit Raises $5 Million to Reinvent the Classroom - NYTimes.com - 0 views

  • Coursekit is a new tool that lets teachers and educators create mini social networks around individual courses and lectures.
  • The goal of the service, said Joseph Cohen, its co-founder and chief executive, is to take some of the most successful elements of social networking — especially the fluid exchange of ideas that comes naturally to online interactions — to revitalize the education experience. Students are already accustomed to interacting online and supplementing their daily lives with the Web and social media. Why should that stop when it comes to learning? “Our education experience is truly offline,” he said. “We want to build what Facebook has done for your personal life, but for your school.”
  • Using Coursekit’s software, teachers can upload homework assignments, answer questions, grade work and facilitate discussions with their students. In addition, students can use the software to chat with one another, collaborate on projects and share relevant materials with their classmates
  • ...1 more annotation...
  • Coursekit is free to both the instructors and students who want access to it. The company says its main focus is attracting users, not making money