Garden Rant: Forget Gen Y. Make way for Generation G. - 0 views
-
I spent a lot of time talking with and learning from gardeners from many different backgrounds and age groups who would no more hire a landscape designer than I would hire a personal stylist.
-
I feel even more strongly that many Gen Yers take a holistic approach to gardening and are comfortable reinterpreting the definition of what a garden can be. For example, their commitment to the environment, their passion for figuring things out for themselves and their tendency to rely on the internet rather than on books
-
Whether it’s trading in lawn for meadows, ornamentals for edibles or chemicals for compost, the gardening world seems more open to change and innovation than ever before.
- ...1 more annotation...
-
There are many parallels between how learning is changing and how gardening is changing. The image some may have of a gardener as a fogey in a big floppy hat is as quaint as the image of a knitter as an elderly lady with a cat, or a professor as John Houseman in The Paper Chase. Note how younger gardeners are learning - not from books. I see this constantly. They reject manicured lawns as not only old-fashioned but of questionable morals, given the effects on the environment. They believe in eating local. They believe in collaboration and community. This is continuing adult learning, and it's blended learning. Note where Emily Goodman got her idea for her garden design - not from a book. And guess what - it's not limited by age. It's just interesting to me how shifts like this are happening in so many parts of the world outside higher ed. I think this offers more evidence that we need to keep up.
Twitter to Promote and Preserve Underrepresented Languages - 0 views
10 Award-Winning Scientific Simulation Videos - 0 views
-
This kind of visualization is not scalable yet, but will it be soon? "Thanks to increasingly cheap, fast and efficient computing power, scientific simulations are now a crucial tool for researchers who want to ask once-impractical scientific questions or generate data that laboratory experiments can't. 'The human eye can pick out patterns in simulations that are otherwise hard to describe, and they can do it better than any computer,' said visualization scientist Joseph Insley of Argonne National Laboratory. 'Plus, with the incredible amount of data gathered these days, it's difficult to analyze it any other way.'"
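-
A minimal sketch of the point being made here, that the eye finds patterns in plotted simulation output that raw numbers hide. This is my own toy Python illustration, not anything from Argonne; the signal and noise levels are arbitrary.

import numpy as np
import matplotlib.pyplot as plt

# Toy "simulation" output: a damped oscillation buried in noise.
t = np.linspace(0, 20, 2000)
signal = np.exp(-0.1 * t) * np.sin(2 * np.pi * t)
data = signal + 0.3 * np.random.randn(t.size)

# Plotting the raw output lets the eye spot the decaying oscillation,
# which is much harder to see in a table of numbers.
plt.plot(t, data, linewidth=0.7)
plt.xlabel("time")
plt.ylabel("simulated value")
plt.title("Toy simulation output for visual inspection")
plt.show()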
Wired Campus - The Chronicle of Higher Education - 0 views
-
"The game EteRNA, which was started by the Stanford biochemist Rhiju Das and the Carnegie Mellon computer scientist Adrien Treuille, allows researchers to farm out some of the intellectual legwork behind RNA design to 26,000 players, rather than a relatively few lab workers. Players are given a puzzle design-an RNA molecule in the shape of a star or a cross, for example-that they must fill in with the components, called nucleotides, to produce the most plausible solution. The community of players then votes for the blueprint it thinks will have the best chance of success in the lab. The Stanford researchers select the highest-rated blueprints and actually synthesize them. The scientists then report back the results of the experiments to the crowd to inform future designs. The crowd-sourcing has produced results that tend to be more effective than computer-generated arrangements. "Computational methods are not perfect in making these shapes," says Mr. Das, "and as we get to more and more complex ones, they essentially always fail, so we know that there are rules to be learned." Players are figuring out these principles on their own, says Mr. Treuille. He says that while they're more like a grandmother's instructions on baking a cake than a strict scientific formula, they work remarkably well in practice. "EteRNA players are extremely good at designing RNA's," says Mr. Treuille, "which is all the more surprising because the top algorithms published by scientists are not nearly so good. The gap is pretty dramatic.""
-
Interesting example of crowdsourcing to work on scientific issues.
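-
A rough sketch of the loop described above (players submit designs, the crowd votes, the top-rated blueprints get synthesized, and lab results are reported back). The data structures, scores, and function names are my own assumptions for illustration, not EteRNA's actual code.

# Hypothetical model of the vote-then-synthesize cycle; all values are made up.
designs = [
    {"id": 1, "sequence": "GGAAACUUCGGUUUCC", "votes": 412},
    {"id": 2, "sequence": "GGCGAAAGCCUUCGGC", "votes": 377},
    {"id": 3, "sequence": "AAUUGGCCAAUUGGCC", "votes": 58},
]

def select_for_synthesis(candidates, k=2):
    # The lab only synthesizes the k highest-voted blueprints.
    return sorted(candidates, key=lambda d: d["votes"], reverse=True)[:k]

def report_results(selected, lab_scores):
    # Attach experimental outcomes so the crowd can learn from them.
    for design, score in zip(selected, lab_scores):
        design["lab_score"] = score
    return selected

chosen = select_for_synthesis(designs)
print(report_results(chosen, lab_scores=[0.92, 0.81]))  # placeholder lab results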
Think You're An Auditory Or Visual Learner? Scientists Say It's Unlikely : Shots - Heal... - 0 views
-
When he reviewed studies of learning styles, he found no scientific evidence backing up the idea. "We have not found evidence from a randomized control trial supporting any of these," he says, "and until such evidence exists, we don't recommend that they be used." Willingham suggests it might be more useful to figure out similarities in how our brains learn, rather than differences. And, in that case, he says, there's a lot of common ground. For example, variety. "Mixing things up is something we know is scientifically supported as something that boosts attention," he says, adding that studies show that when students pay closer attention, they learn better.
Simply Speaking - Teaching and Learning with Technology - 5 views
-
Simply Speaking is a series of brief videos created by Teaching and Learning with Technology that explain technology topics in everyday language and with a little humor. They are modeled after the "... in plain english" videos that explain more general technologies such as Google Docs.
-
A new page to show all of the Simply Speaking videos that we have created over the past couple of years. Other ideas for similar videos like this are in the works, such as one to explain the importance of open educational resources and another talking about the ideas behind flipping a course.
The Twitter Trap - NYTimes.com - 3 views
-
"But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity. " "Genuine Empathy" is the one that really concerns me, and I see it in how my nieces, and others, use facebook. The FB birthday thing comes to mind...now people get as many "Happy birthday!" notes as they have friends...but are the well-wishers even thinking about my birthday? Probably not, it's just FB reminding them "Hey, it's bart's bday" and now the norm is to stop by and say "happy birthday" without even thinking about it. The end of the article has a nice quote from a novel as well: "The generation that had information, but no context. "
-
This is a well-written piece. The author does a great job of tugging on our emotions. However, I believe he possesses only a superficial understanding of the medium.
DNA/How to Stop Worrying and Learn to Love the Internet - 1 views
-
I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this: 1) everything that’s already in the world when you’re born is just normal; 2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it; 3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.
-
In other words the cost of connection is rapidly approaching zero, and for a very simple reason: the value of the web increases with every single additional person who joins it. It’s in everybody’s interest for costs to keep dropping closer and closer to nothing until every last person on the planet is connected.
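-
One way to make that economics concrete, under a Metcalfe-style assumption of my own (value tracks the number of possible connections, which grows roughly with the square of the user count, while infrastructure cost grows far more slowly):

# Hedged illustration: possible pairwise connections for n connected people.
for n in (10, 1_000, 1_000_000):
    connections = n * (n - 1) // 2
    print(f"{n} people -> {connections} possible connections")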
-
Another problem with the net is that it’s still ‘technology’, and ‘technology’, as the computer scientist Bran Ferren memorably defined it, is ‘stuff that doesn’t work yet.’ We no longer think of chairs as technology, we just think of them as chairs. But there was a time when we hadn’t worked out how many legs chairs should have, how tall they should be, and they would often ‘crash’ when we tried to use them. Before long, computers will be as trivial and plentiful as chairs (and a couple of decades or so after that, as sheets of paper or grains of sand) and we will cease to be aware of the things. In fact I’m sure we will look back on this last decade and wonder how we could ever have mistaken what we were doing with them for ‘productivity.’
- ...1 more annotation...
Tim Harford's Adapt: How to fund research so that it generates insanely great ideas, no... - 2 views
-
What did Capecchi do? He took the NIH's money, and, ignoring their admonitions, he poured almost all of it into his risky gene-targeting project. It was, he recalls, a big gamble. If he hadn't been able to show strong enough initial results in the three-to-five-year time scale demanded by the NIH, they would have cut off his funding. Without their seal of approval, he might have found it hard to get funding from elsewhere. His career would have been severely set back, his research assistants looking for other work. His laboratory might not have survived. In 2007, Mario Capecchi was awarded the Nobel Prize for Medicine for this work on mouse genes. As the NIH's expert panel had earlier admitted, when agreeing to renew his funding: "We are glad you didn't follow our advice."
-
Whichever way they sliced the data, Azoulay, Manzo and Zivin found evidence that the more open-ended, risky HHMI grants were funding the most important, unusual, and influential research. HHMI researchers, apparently no better qualified than their NIH-funded peers, were far more influential, producing twice as many highly cited research articles. They were more likely to win awards and more likely to train students who themselves won awards.
-
The HHMI researchers also produced more failures; a higher proportion of their research papers were cited by nobody at all. No wonder: The NIH program was designed to avoid failure, while the HHMI program embraced it. And in the quest for truly original research, some failure is inevitable.
Wolfram Launches PDF Killer - 0 views
How Big Can E-Learning Get? At Southern New Hampshire U., Very Big - Technology - The C... - 0 views
-
In a former textile mill in downtown Manchester, the university's president, Paul J. LeBlanc, has installed a team of for-profit veterans who help run a highly autonomous online outfit that caters to older students, with classes taught mostly by low-paid adjuncts. Their online operation is the institution's economic engine, subsidizing its money-losing undergraduate campus, known as University College, whose 2,350 students enjoy a new dining hall, Olympic-size pool, and small classes taught largely by full-time professors. "The traditional campus, in some ways, now has the resources to be even more traditional," Mr. LeBlanc says in his office on the suburban main campus, four miles from the online college. "And the nontraditional, with this split, has the ability to be even more nontraditional."
-
"It doesn't seem to me to be the 'disruptive innovation' that's going to transform things," says Richard Arum, a professor of sociology and education at New York University and one of the authors of Academically Adrift, a harsh critique of undergraduate learning. "It seems to me like just business as usual.
-
A lucrative one, too. With 7,000 online students, up from 1,700 four years ago, the College of Online and Continuing Education is on track to generate $73-million in revenues this year and more than $100-million next year. It posted a 41-percent "profit" margin in the 2011 fiscal year. The university plows the surplus into new buildings, employee salaries, financial aid at the traditional campus, and improvements in the online program.
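-
Back-of-the-envelope arithmetic (mine, not a figure from the article): if this year's revenue hits the projected $73 million at last year's 41 percent margin, the surplus would be on the order of $30 million (0.41 × 73 ≈ 29.9) available to redirect to the traditional campus.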
- ...3 more annotations...
Milwaukee 7th-grader among winners in national video game design contest - JSOnline - 0 views
-
A seventh-grader from Milwaukee Montessori School is among the winners of a nationwide video game design challenge launched at the White House last fall. Shireen Zaineb created a game called "Discover.." that earned her a victory in the National STEM Video Game Challenge, which was designed to generate interest in science, technology, engineering and math, also known as STEM. Zaineb's Web-based game teaches players about concepts such as mass, friction, weight and gravity through a series of platforming challenges in which players must jump a character through 2-D environments and collect items.
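-
A minimal sketch of the kind of 2-D physics a platformer like this exposes (my own toy code, not Zaineb's game; all constants are illustrative guesses): gravity pulls the character down every frame, friction bleeds off horizontal speed, and mass scales how far a jump impulse carries.

# Toy 2-D platformer physics step.
GRAVITY = 9.8      # downward acceleration
FRICTION = 0.85    # fraction of horizontal speed kept each step

class Character:
    def __init__(self, mass=1.0):
        self.mass = mass
        self.x, self.y = 0.0, 0.0
        self.vx, self.vy = 0.0, 0.0

    def jump(self, impulse=30.0):
        # A heavier character gets less upward speed from the same impulse.
        self.vy += impulse / self.mass

    def step(self, dt=0.1):
        self.vy -= GRAVITY * dt                    # gravity
        self.vx *= FRICTION                        # friction
        self.x += self.vx * dt
        self.y = max(0.0, self.y + self.vy * dt)   # ground at y = 0
        if self.y == 0.0:
            self.vy = 0.0

c = Character(mass=2.0)
c.jump()
for _ in range(5):
    c.step()
    print(round(c.y, 2))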
What if he is right? - 2 views
-
The printing press brought about a radical change. People began getting their information primarily by seeing it - the printed word. The visual sense became dominant. Print translates one sense - hearing, the spoken word - into another sense - sight, the printed word. Print also converts sounds into abstract symbols, the letters. Print is an orderly progression of abstract, visual symbols. Print led to the habit of categorizing - putting everything in order, into categories, "jobs," "prices," "departments," "bureaus," "specialties." Print led, ultimately, to the creation of the modern economy, to bureaucracy, to the modern army, to nationalism itself.
-
People today think of print as if it were a technology that has been around forever. Actually, the widespread use of print is only about two hundred years old. Today new technologies - television, radio, the telephone, the computer - are causing another revolution. Print caused an "explosion" - breaking society up into categories. The electronic media, on the other hand, are causing an "implosion," forcing people back together in a tribal unity.
-
There will be a whole nation of young psychic drop-outs - out of it - from the wealthy suburbs no less than the city slums. The thing is, all these TV-tribal children are aural people, tactile people, they're used to learning by pattern recognition. They go into classrooms, and there up in front of them are visual, literate, print-minded teachers. They are up there teaching classes by subjects, that is, categories; they've broken learning down into compartments - mathematics, history, geography, Latin, biology - it doesn't make sense to the tribal kids, it's like trying to study a flood by counting the trees going by, it's unnatural.
- ...3 more annotations...
7 Things You Should Know About the Modern Learning Commons | EDUCAUSE - 4 views
-
The learning commons, sometimes called an “information commons,” has evolved from a combination library and computer lab into a full-service learning, research, and project space. As a place where students can meet, talk, study, and use “borrowed” equipment, the learning commons brings together the functions of libraries, labs, lounges, and seminar areas in a single community gathering place. The cost of a learning commons can be an obstacle, but for institutions that invest in a sophisticated learning commons, the new and expanded partnerships across disciplines facilitate and promote greater levels of collaboration. The commons invites students to devise their own approaches to their work and to transfer what they learn in one course to the work they do for another.
-
This is a critical discussion today and will be more important going forward. If TLT wants to create a vision for the best learning spaces in higher education, we need to better understand what is and isn't working. My emerging goal is to establish a strategic direction that has us look at our spaces on a continuum from very informal to very formal in a consistent and systematic way.
-
The writing process around this particular 7 Things paper was a lot of fun. I got a real sense that what we're doing with the Media Commons spaces, especially the plans for the Knowledge Commons and Ritenour, is in line with the kinds of spaces being developed at other universities. There was a lot of discussion around the political side of these spaces, since the physical space, staffing, and resources don't fall into a neat hierarchy of organizational structure. Anyway, I'd really enjoy being part of a discussion about space design. There is a set of recommendations that the informal learning spaces group generated two years ago that hasn't been acted upon. Not that those recommendations are still the right way to go, but it's a starting point for some of the discussion: http://tlt.its.psu.edu/about/reports/2009/Learning-Spaces-Vision.pdf/view
News: 'Now You See It' - Inside Higher Ed - 2 views
-
Q: What are some of the ways that you've applied ideas and research about attention and learning in your own classroom? A: I rarely lecture anymore. I structure my classes now with each unit led by two students, who are responsible for researching and assigning texts and writing assignments and who then are charged with grading those assignments. The next week, two other students become our peer leaders. Students learn the fine art of giving and receiving feedback and learning from one another. I structure midterms as collaborative “innovation challenges,” an incredibly difficult exercise which is also the best way of intellectually reviewing the course material I’ve ever come up with. In other words, more and more I insist on students’ taking responsibility for their learning and communicating their ideas to the general public using social media.
-
If you want to learn more, you can find syllabuses and blogs on both the HASTAC and the DMLCentral site. I posted about “This Is Your Brain on the Internet” and “Twenty-First Century Literacies.” I also led a forum on interactive pedagogy in large lecture classes.
Interactive Whiteboard Meets the iPad | MindShift - 2 views
-
Kim told me he wants to enable anyone to build their own portfolio of educational content – to build hundreds of Khan Academies. That’s a goal that puts teacher- and student-generated content at the center of education, one enabled by a simple, but smoothly functioning app — all on a portable device.
-
At the same time as many educators are rethinking the hardware involved with instruction, some are rethinking other ways in technology can change the classroom. Some are experimenting with the “flipped classroom” — the idea, made quite famous lately thanks to Khan Academy, that videotaped instruction can be assigned as homework, while in-class time can be used for more personalized remediation, for collaboration among teachers and students, and for the types of exercises that have typically been seen as homework. A new app taps into both of these phenomena: bringing an interactive whiteboard-like experience to the iPad and to the Web and making it easy for iPad owners to create their own instructional videos.
-
Very interesting development. We've been holding off on iPads in engineering because of the lack of a streamlined screencasting workflow. I wonder if other example-heavy STEM disciplines at PSU (chem, math, stats, etc.) might be interested in a pilot of some kind?
-
I'm having conversations along these lines on several fronts. I asked Hannah to look into a system that could replicate the Khan Academy stuff. Carol McQuiggan has some faculty who are interested in the model. Chris Lucas and I may talk about it as well, related to creating open training resources. I've also brought Chris Millet into the mix because this could line up with some of the work he is doing with lecture capture (not capturing lectures per se, but many of the software options let faculty create screen-capture tutorials and have them automatically upload to a server along with their voice annotation).