Contents contributed and discussions participated by Bonnie Sutton

Bonnie Sutton

New Resources for NAEP Researchers Now Available - 0 views

Naep
started by Bonnie Sutton on 22 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    New resources are now available to help researchers use data from the National Assessment of Educational Progress (NAEP).

    * The NAEP Primer on CD-ROM guides new researchers through the technical history of NAEP and the intricacies of the NAEP database and its data tools. This 196-page guide also includes a small sample of NAEP data, along with an overview of that sample, its design, and its implications for analysis. Order the Primer at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2011463.

    * NAEP Training at AERA's 2012 Annual Meeting - April 14.
    Register for Using NAEP Data on the Web for Educational Policy Research. The training session will be held at the American Educational Research Association annual meeting, and costs $95. For more information, go to the AERA website.

    * 2000-2008 NAEP Technical Documentation.
    Read about all technical aspects of the assessment at http://nces.ed.gov/nationsreportcard/tdw/.

    * Work With the Online Public-Use NAEP Data. These data tools include links to tutorials and other guides for analyzing NAEP data.

    * NAEP Restricted-Use Datasets through 2009.
    NAEP datasets from 1990 through 2009, and the variables in each, are available for licensed researchers at http://nces.ed.gov/nationsreportcard/researchcenter/variablesrudata.asp.

    NAEP is a product of the National Center for Education Statistics within the Institute of Education Sciences, part of the U.S. Department of Education.

    ...CONNECTING RESEARCH, POLICY AND PRACTICE

    You have received this message because you subscribed to a newsflash service through IES or one of its centers.
    Change your options or unsubscribe from this service.

    By visiting Newsflash you may also sign up to receive information from IES and its four centers (NCES, NCER, NCEE, and NCSER) to stay abreast of all activities within the Institute of Education Sciences (IES).

    To obtain hard copies of many IES products, as well as hard copy and electronic versions of hundreds of other U.S. Department of Education products, please visit http://www.edpubs.org or call 1-877-433-7827 (877-4-EDPUBS).
Bonnie Sutton

A Brief Future of Computing - 0 views

University of Edinburgh Dr. Francis Wray HPC History computing supercomputing
started by Bonnie Sutton on 22 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Dr Francis Wray looks back over the history of HPC and offers his insight into what can be said about the systems of the future.

    Introduction

    Over the past 30 years, computing has come to play a significant role in the way we conduct our lives. The development of PCs, enabling applications such as word processing and spreadsheets; the availability of the internet, bringing with it search engines, e-commerce, voice-over-IP and email; games consoles; and the emergence of mobile computing through laptops, tablets and smart phones have changed forever the ways in which we interact with our family, friends and surroundings. All this has happened in a very short time and with an increasing rate of change.

    As a simple indicator of this change, consider the power available in a laptop. We have chosen the laptop simply because other mobile devices such as smart phones and tablets did not exist 20 years ago. In 1990, the first laptops typically had a 10 MHz processor, 4 MB of memory and a floating-point capability of 1 MFLOPs [1]. By the mid-1990s, processor frequencies had risen to 50 MHz, memory to 32 MB and floating-point capability to 50 MFLOPs. In 2012, a typical laptop has a processor comprising two cores clocked at 2.5 GHz, 4 GB of memory and a floating-point capability of 50 GFLOPs. In little more than 20 years, the capability of a laptop to process data (or play games) has increased more than a thousand-fold. This processing capability, combined with a similar increase in networking capability (from the 14 kb/s modem to 20 Mb/s broadband), has created unprecedented opportunities for innovation and societal change. In no other aspect of society have we seen such rapid development.
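
    As a rough illustration of the rates implied by these figures, the short Python sketch below computes the overall growth factors and the doubling times they imply, assuming steady exponential growth between 1990 and 2012. Only the laptop specifications are taken from the text; the doubling-time arithmetic is our own illustration, not the author's.

    # Growth factors and implied doubling times for the laptop figures quoted above.
    # Assumes steady exponential growth between 1990 and 2012 (an illustration only).
    import math

    specs_1990 = {"clock_hz": 10e6, "memory_bytes": 4e6, "flops": 1e6}
    specs_2012 = {"clock_hz": 2 * 2.5e9, "memory_bytes": 4e9, "flops": 50e9}

    years = 2012 - 1990
    for key in specs_1990:
        growth = specs_2012[key] / specs_1990[key]
        # Doubling time implied by steady exponential growth over the period.
        doubling_years = years * math.log(2) / math.log(growth)
        print(f"{key}: x{growth:,.0f} overall, doubling roughly every {doubling_years:.1f} years")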

    Compare this with the world of supercomputing. In June 1993, the world's most powerful supercomputer had a floating-point capability of 60 GFLOPs. In November 2011, the world's most powerful supercomputer had a floating-point capability of 11 PFLOPs, some 170,000 times greater. There are several things to note here. Firstly and most remarkably, in 1990 the most powerful supercomputer in the world would have had a performance of less than that of a present-day laptop. Secondly, the performance of supercomputers has increased at a somewhat faster rate than that of commodity laptops. This is not surprising because both types of computer now use similar components, but the number of processors in a supercomputer is also increasing, from a few thousand in the early 90s to hundreds of thousands now. Finally, whilst the social impact of computing is clear for all to see, the industrial, and less visible, impact of supercomputing has also been very significant in areas ranging from the design of drugs to that of complete aircraft. In terms of economic effect, it is clear that "The country that out-computes will be the one that out-competes." [2]
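
    To make the comparison concrete, the following sketch (our own arithmetic, using only the supercomputer figures quoted in the text) computes the overall increase, about 180,000x with these rounded figures and in line with the text's "some 170,000 times", and the annual growth rate it implies.

    # Supercomputer growth between the June 1993 and November 2011 leaders,
    # using the figures quoted above (60 GFLOPs and 11 PFLOPs).
    top_1993 = 60e9           # FLOPS, June 1993
    top_2011 = 11e15          # FLOPS, November 2011
    years = 2011.9 - 1993.5   # approximate span between the two lists

    ratio = top_2011 / top_1993
    annual_growth = ratio ** (1 / years) - 1
    print(f"overall increase: about {ratio:,.0f}x")
    print(f"implied annual growth: about {annual_growth:.0%} per year")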

    Although the purpose of this article is not to follow social trends and the influence computing has exerted upon them, these simple statistics show the synergy between the development of personal computing and that of supercomputing. In what follows, we will focus on the historical development of supercomputing, but bear in mind its symbiotic relationship with mobile computing. In drawing some conclusions about the future of supercomputing, we shall inevitably touch on how computing will exert an ever greater influence on our daily lives.

    The development of supercomputing (the early days)

    The early days of supercomputing were characterised by custom devices and exotic technologies. The first recognised supercomputer, the Cray-1, was installed at Los Alamos National Laboratory in 1976. It had a peak floating point performance of 160 MFLOPs, used a custom processor and was cooled by liquid Freon.

    This was followed by a series of developments not only by Cray, but also by Fujitsu, Hitachi and NEC, who entered the supercomputer market with offerings again based on custom processors and state-of-the-art cooling systems. Examples include the Fujitsu VP-200 (announced in July 1982, with a peak performance of 500 MFLOPs), the Hitachi HITAC S-810 (announced in August 1982, with a peak performance of 630 MFLOPs) and the NEC SX-1 vector supercomputer (announced in April 1983, with a peak performance of 570 MFLOPs), the first in a line of supercomputers leading to the SX-9, announced in 2008. The SX-1 was announced at the same time as the SX-2, which had a peak performance of 1.3 GFLOPs.

    In the late 1980s, a series of massively parallel computers using large numbers of commodity processors entered the high-performance computing market. These included systems from Thinking Machines, Intel, nCube, MasPar and Meiko Scientific. In 1993, Cray announced its T3D system, comprising up to 2048 DEC Alpha processors connected by a high-speed network. The potential performance of the 2048-processor system was 200 GFLOPs, although the largest system ever sold had 1024 processors and a performance of 100 GFLOPs. This was a landmark announcement, which heralded the dominant position of commodity processors in computer systems ranging from mobile devices through to top-of-the-range supercomputers. From then on, the power of supercomputers would be determined by the performance of an individual processor and by the number that could be integrated into a single computer, limited only by power, cooling, volume, networking and financial constraints.

    The invasion of the killer micros

    From 1993, when Cray announced the T3D, it was easy to see how all computers, supercomputers included, would get faster and faster. Moore's Law told us that every 2 years or so the number of transistors in a given area of silicon would double. If these transistors could be used effectively, then the performance of individual processors would increase, as would that of systems containing several such processors. When Cray introduced the T3D in 1993, it was taking advantage of a trend which had already started in the 70s, had continued through the 80s and would continue to around 2005. This trend, simply put, was that single processors would get faster by increasing their clock speeds and by using the increased transistor counts dictated by Moore's Law to increase performance. In the 70s and 80s, each chip generation would make single-threaded code execute faster through a combination of increased clock speed and the use of the additional available transistors to add a major feature to the chip, such as a floating-point unit, out-of-order execution or pipelining. In the 90s and early 2000s, each chip generation would make single-threaded code run faster through a combination of increased clock speed and the use of the additional available transistors to add an increasing number of progressively smaller features. By 2005, Moore's Law still applied, but this trend had hit a wall.

    What had stopped this trend? There were two factors. The first was heat dissipation, which rises with clock frequency; this has effectively limited clock speeds to less than 3 GHz. The second was that there were no more features to add that would realistically speed up single-threaded code. Nevertheless, there remained a way to make use of the still-increasing number of transistors dictated by Moore's Law, and that was to put more than one processor core on each chip. This marked the start of the multicore revolution, which we shall describe in the next section.
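
    The frequency wall can be made concrete with the standard first-order model of CMOS dynamic power, P ≈ C·V²·f: because higher clock speeds generally also require higher supply voltages, power grows faster than linearly with frequency. The sketch below is purely illustrative; the capacitance and voltage values are invented for the example and do not come from the article.

    # First-order CMOS dynamic power model: P ~ C * V^2 * f.
    # The switched capacitance and supply voltages below are invented, illustrative values.
    def dynamic_power(capacitance_f, voltage_v, frequency_hz):
        """Switching power of a chip modelled as one lumped switched capacitance."""
        return capacitance_f * voltage_v ** 2 * frequency_hz

    C = 10e-9   # 10 nF of switched capacitance (hypothetical)
    for f_ghz, v in [(1.0, 0.9), (3.0, 1.1), (5.0, 1.3)]:
        # Higher clocks typically need higher supply voltage, so power rises
        # faster than linearly with frequency.
        p = dynamic_power(C, v, f_ghz * 1e9)
        print(f"{f_ghz:.0f} GHz at {v:.1f} V -> about {p:.0f} W")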

    The multicore revolution

    As we have seen in the previous section, around 2005 it was no longer realistic to try to speed up single processors. Instead, the obvious step was to put increasing numbers of processor cores onto a single chip. This trend is now well established: most laptops have dual-core processors, servers have processors with four or more cores, and Intel has recently announced a processor with around 50 cores. A key feature of these cores is that they are all identical and can be programmed with a homogeneous programming model, at least within the confines of a single chip.

    How long can this new trend continue? The short answer is that it is already being superseded by a move to heterogeneous multicore processors. Before we consider such devices, it is useful to look at the pros and cons of a homogeneous, multicore processor.

    The principal advantage is that the programming model for such a device is relatively simple because each core has the same instruction set and capabilities as every other core. Note the use of "relatively" because programming parallel systems comprising multiple cores can be far from simple in some cases. Clearly this type of processor is simpler to design because a single core can be replicated to fill the available area of silicon. Although the computing power of multicore devices can continue to increase rapidly by increasing the number of cores, this potential power is harder to exploit because applications need to be converted to run on parallel systems. Nevertheless, at least for small numbers of cores, support is available from proprietary parallel runtimes, profilers and debuggers.

    A disadvantage of the homogeneous multicore approach is backwards compatibility. New multicore processors need to support legacy software, and in some cases this means providing redundant capability and replicating it across all cores. In other words, this approach does not result in the most efficient use of the available transistors. A further difficulty, which applies to all multicore devices, homogeneous or heterogeneous, is that of memory models. The more cores in a device, the more complicated the on-chip memory system needs to be to give each core a consistent view of, and ready access to, on-chip data. Of course, this is an architectural choice. It would be perfectly possible for each core to have its own local memory and for cores to exchange data by passing messages. However, at least for the time being, manufacturers have chosen not to implement this paradigm in their mainstream processors, though they may be forced to review their memory models as core counts increase. Finally, there is the issue of memory-interface bandwidth, which also applies to both heterogeneous and homogeneous multicore devices. Quite simply, the more cores there are on a device, the faster the external memory interface needs to be to keep all those cores supplied with data. This remains an unsolved problem, but one which can be circumvented for the time being while the number of cores on a device remains sufficiently small. Nevertheless, it is a fundamental barrier to the development of processors with very large numbers of cores, both homogeneous and heterogeneous. Ultimately, new architectural choices, combined with the development of new algorithms, will be needed to address this issue.
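
    A simple way to see the bandwidth problem is to compare the data the cores demand with what the external memory interface can supply. The figures in the sketch below (per-core compute rate, bytes needed per floating-point operation, interface bandwidth) are invented round numbers, chosen only to show how quickly aggregate demand outgrows a fixed interface as cores are added.

    # Illustrative memory-bandwidth arithmetic; all figures are hypothetical round numbers.
    core_gflops = 10        # sustained GFLOPs per core (hypothetical)
    bytes_per_flop = 0.5    # data demand of the application (hypothetical)
    interface_gb_s = 50     # external memory bandwidth in GB/s (hypothetical)

    for cores in (2, 8, 32, 128):
        needed_gb_s = cores * core_gflops * bytes_per_flop
        verdict = "OK" if needed_gb_s <= interface_gb_s else "bandwidth-bound"
        print(f"{cores:4d} cores need {needed_gb_s:6.0f} GB/s -> {verdict}")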

    Heterogeneous multicore processors

    Only seven years after the start of the homogeneous multicore era, we are now confronted with compelling reasons to adopt heterogeneous multicore processors. This approach was pioneered in the IBM Cell processor used in the PlayStation 3. A further example of such devices is the AMD Fusion, which combines multiple Central Processing Units (CPUs) with a Graphics Processing Unit (GPU); in particular, the A10 series combines 4 CPU cores with a GPU and is targeted at the HPC marketplace. Another example is the NVIDIA Tegra, which combines a dual-core ARM Cortex-A9 processor with a GeForce GPU. Although targeted at the mobile computing marketplace, it is easy to see how this technology could be further developed and applied to HPC. Yet another example is ARM's big.LITTLE approach, which combines high-performance cores with low-power cores, enabling an application to choose the best processor configuration dynamically and so minimise power consumption. Finally, Intel's Many Integrated Core (MIC) architecture combines heavy-duty and lightweight cores on the same die to optimise processing capabilities.

    What are the reasons for moving to heterogeneous multicore processors? It is all a question of what we are trying to optimise. Up to 2005, the objective was to get the most performance out of a single processor, regardless of anything else. At the start of the homogeneous multicore era, this objective changed to one of getting the maximum performance per unit area of silicon. Now the objective has become one of getting the maximum performance per joule, given that transistor count is no longer an issue thanks to the relentless advance of Moore's Law. The reasons for this last change in objective are clear. At the top end, supercomputers are consuming too much power (several megawatts) and viable systems now need to maximise compute per unit of energy. At the mobile computing end, the same constraint applies. The best way to satisfy this constraint is to perform calculations on the cores which use the least energy for those particular calculations. At the moment, the choice is limited to CPUs, GPUs and Field Programmable Gate Arrays (FPGAs), but other specialist processing cores will be deployed as heterogeneous devices mature. Indeed, chips that integrate several diverse functional units, which can be turned off and on as needed, are already being considered.

    Data-intensive computing

    Finally, we need to add data-intensive computing to the mix. This will deploy high-performance systems optimised for the processing of data using integer and logical operations rather than for numerically intensive applications. Such systems will sit comfortably within the heterogeneous multicore landscape. They will have access to highly distributed data via the Cloud and the Internet and may even make extensive use of data from mobile computing devices. This will be data mining "on steroids". It will create the opportunity for the unprecedented discovery of knowledge from data, enabling as yet unimagined applications which may have profound effects on commerce and society.

    The systems of the future

    Based on current rates of progress, it is projected that exaflops (EFLOPs) systems will be available in 2019 and that zettaflops (ZFLOPs) systems will be available by 2030. Cray has already announced plans to build a 1 EFLOPs system before 2020. Somewhat astonishingly, in India, ISRO and the Indian Institute of Science have plans to build a 132.8 EFLOPs supercomputer by 2017.

    It is interesting to speculate on the processor and core count of EFLOPs systems. Focusing on the 2020 horizon for an EFLOPs system, we might anticipate the performance of a single processor to be 50 TFLOPs, for it to comprise 1000 heterogeneous cores each capable of 0.5 TFLOPs, but with only 10% of the cores running at any one time, and for a system to comprise 20,000 such processors. Such numbers are, of course, speculation and are not supported by any firm technological announcements. Indeed each of these figures may be out by a factor of 10 or even more. However, it is clear that there is significant optimism that an EFLOPs system will be feasible by 2020 and that these systems will contain millions of cores.
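
    The arithmetic behind this speculative configuration is easy to check; the sketch below simply multiplies out the figures given in the text.

    # Multiplying out the speculative exaflops configuration described above.
    cores_per_processor = 1000
    core_tflops = 0.5
    active_fraction = 0.10
    processors = 20_000

    processor_tflops = cores_per_processor * core_tflops * active_fraction   # 50 TFLOPs per processor
    system_eflops = processors * processor_tflops / 1e6                      # 1 EFLOPs in total
    total_cores = processors * cores_per_processor                           # 20 million cores
    print(f"per processor: {processor_tflops:.0f} TFLOPs; system: {system_eflops:.0f} EFLOPs")
    print(f"total cores: {total_cores:,}")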

    What challenges lie ahead in the development and use of EFLOP systems? The development of effective memory models and interfaces is a clear priority. The development of effective programming methodologies and languages and new algorithms able to harness the new massively parallel, heterogeneous, multicore systems is essential if such systems are going to be usable. These are difficult issues, which will need to be tackled fully if the relentless increase in computer performance is to be maintained. The economic and social implications of maintaining this rate of increase are highly significant and are discussed later in this section.

    From the earlier discussion, we can see that the performance of a mobile device lags around 20 years behind that of the most powerful supercomputer, equivalent to a performance factor of around one million (give or take another factor of 10). That is to say, by 2020 we can expect mobile devices (laptops, tablets and smart phones) with a performance of around 1 TFLOPs. A standard laptop with a standard GPU already has a performance of 50 GFLOPs, and such a system can even now be fitted with a high-performance GPU card to raise its performance to 1 TFLOPs. The GPU in a typical tablet or smart phone currently performs at around 5 GFLOPs. All this supports the conclusion that by 2020 we will have hand-held mobile devices capable of several hundred GFLOPs or more. The interesting question is to what use this performance will be put.
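
    The rule of thumb in this paragraph can be written down directly: take the contemporary top supercomputer and divide by roughly one million (equivalently, look 20 years back), accepting the stated give-or-take factor of 10. The sketch below applies it to the figures quoted or projected earlier in the article; it is an illustration of the argument, not an independent forecast.

    # Applying the "factor of about one million" rule of thumb to the article's figures.
    top_supercomputer_flops = {
        2011: 11e15,   # 11 PFLOPs, November 2011 (quoted above)
        2019: 1e18,    # ~1 EFLOPs, the projection discussed in the previous section
    }
    lag_factor = 1e6   # mobile device ~ one millionth of the contemporary leader

    for year, flops in top_supercomputer_flops.items():
        mobile_gflops = flops / lag_factor / 1e9
        print(f"around {year}: mobile devices of the order of {mobile_gflops:,.0f} GFLOPs")

    With these inputs the estimates (roughly 10 GFLOPs around 2011 and about 1 TFLOPs towards 2020) sit within the article's stated factor-of-10 tolerance of the laptop, tablet and hand-held figures quoted above.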

    To begin to answer this question, we have to look at the whole spectrum of computing. Improved network connectivity, the development of Cloud computing, the development of data-intensive computing, combined with very powerful, ubiquitous hand-held mobile devices will enable a whole new generation of applications. Highly sophisticated computer games and the wider use of crowd sourcing are obvious applications. These applications will use not only the power of the hand-held device, but will be able to interact seamlessly with Cloud-based computers including state-of-the-art supercomputers and data-intensive computers. This seamless interaction will create many new commercial opportunities with significant societal and economic impacts. It will create a new market for services, many of them HPC-based, and it will enable, as yet unimagined, new applications.

    Conclusions

    The power of all types of computing is increasing rapidly, more than doubling every two years. By 2020, it is anticipated that supercomputers will have a performance of around 1 EFLOPs, desktop systems will have a performance of up to 100 TFLOPs and hand-held devices a performance of several hundred GFLOPs. These figures are subject to a potential error of an order of magnitude, but are certainly supported by recent history. What is also clear is that increases in computer processing power will now come through increased parallelism, and this will require significant changes in the ways in which computers are programmed.

    This availability of significant computing power, combined with improved network connectivity, will enable a whole new generation of applications interacting seamlessly with Cloud-based computers, including state-of-the-art supercomputers. These new applications will range from highly sophisticated computer games to completely new services and, as yet, unimagined applications.

    Many challenges lie ahead. The development of effective memory models and interfaces is a clear priority. The development of effective programming methodologies and algorithms able to harness the new massively parallel, heterogeneous, multicore systems is essential. If these problems can be solved, then significant economic and societal opportunities lie ahead through the exploitation of the multicore revolution.

    Dr Francis Wray is a consultant who has been involved in HPC for over 25 years. He is a Visiting Professor at the Faculty of Computing, Information Systems and Mathematics at the University of Kingston.


    [1] 1 MFLOP = 10^6 FLOPS (floating-point operations per second); 1 GFLOP = 10^9 FLOPS; 1 TFLOP = 10^12 FLOPS; 1 PFLOP = 10^15 FLOPS; 1 EFLOP = 10^18 FLOPS; 1 ZFLOP = 10^21 FLOPS.

    [2] http://www.isgtw.org/visualization/why-advanced-computing-matters

    © 2012 The University of Edinburgh
Bonnie Sutton

Transforming Higher Education with IT - 2 views

  • Bonnie Sutton
  • Bonnie Sutton
     
    WEDNESDAY, MARCH 28, 2012
    9:00 AM - 10:30 AM

    Information Technology and Innovation Foundation
    1101 K Street NW
    (Suite 610A)
    Washington,
    DC
    20005

    President Obama recently announced a new initiative to address the question of higher education affordability. For decades, the cost of college has been growing faster than inflation, putting college out of reach for an increasing number of American families. However, if we are going to keep college cost increases down, information technology will have to play a key role, just as it has in a host of other industries. Administrative costs and inefficiencies in higher education do little to improve the quality of education but make it less accessible to students at a time when the opposite should be happening. ITIF will explore how IT can address these issues, looking at the experiences of various institutions and states and exploring what can be done at the federal level to improve productivity in higher education.

    PARTICIPANTS:

    Steve Crawford
    Research Professor,
    The George Washington Institute of Public Policy (GWIPP)
    Presenter
    Stephen Ruth
    Professor of Public Policy,
    George Mason University
    Presenter
    Robert D. Atkinson
    President,
    Information Technology and Innovation Foundation
    Moderator
Bonnie Sutton

Call for Chapters Common Core Mathematics Standards and Implementation of Digital Techn... - 0 views

mathematical educators educational technology common core standards
started by Bonnie Sutton on 20 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Common Core Mathematics Standards and Implementing Digital Technologies
    Editor:
    Drew Polly, Ph.D., University of North Carolina at Charlotte, USA

    Call for Chapters:
    Proposals Submission Deadline: March 9, 2012
    Full Chapters Due: June 1, 2012
    Submission Date: December 15, 2012

    Introduction

    Forty-five states in the USA have adopted the Common Core State Standards in both Mathematics and English Language Arts. There is a growing demand for various educational stakeholders to provide resources on the Common Core State Standards in Mathematics. Both assessment consortia, Smarter Balanced and PARCC, are developing technology-based assessments that will be administered in each state starting in 2014-2015. This book will include contributions from both educational technologists and mathematics educators providing examples of how digital technologies can support the implementation of the Common Core State Standards in Mathematics.



    Objective of the Book

    The objectives of this book are to:
    1) Disseminate information about current digital technologies and how they can support the implementation of the Common Core State Standards in Mathematics.

    2) Provide concrete examples appropriate for both K-12 education leaders and university leaders about how digital technologies can support the implementation of the Common Core State Standards in Mathematics.


    Target Audience

    The target audience of this book is educational leaders in the field of both Mathematics Education and Educational Technology. These include higher education faculty, educational researchers, and leaders at the State and District level, as well as others interested in the implementation of the Common Core State Mathematics Standards.


    Recommended topics include, but are not limited to the following:

    · Supporting the Common Core State Mathematics Standards (CCSSM)
    o How can technology support the implementation of the Content (Grade level) Standards?
    o How can technology support learners' work with the Standards for Mathematical Practice?
    · Evidence of Impact or Cases from the Field
    o Research reports or vignettes describing how technology has supported students' mathematical understanding
    · Assessment
    o What might assessment of the CCSSM look like in the 21st Century?
    o How can technology support student assessment regarding the CCSSM?
    o How can technology support teachers' decision making related to the CCSSM?
    · Technological Pedagogical and Content Knowledge (TPACK)
    o How does the TPACK framework influence how technology can support the implementation of the CCSSM?
    o How does TPACK research influence the way we should support teachers and school districts' implementation of the CCSSM?
    · Contemporary Technologies Supporting the CCSSM
    o How can contemporary technologies support the implementation of the CCSSM?
    § Including: Handheld technologies, Web 2.0 tools, virtual manipulatives, simulations, dynamic geometry software (SketchPad, Geogebra), Statistics software (Fathom), Graphing Calculators, Data collection technologies (Probes, Sensors, etc.)
    · Issues and Implications
    o What educational issues and implications in regard to technologies must be addressed as we consider the implementation of the CCSSM?
    o What are the most pressing implications for practice in regard to technology and the CCSSM?
    o What are the most pressing implications for research in regard to technology and the CCSSM?

    Submission Procedure

    Interested authors are invited to submit on or before March 9, 2012, a 2-3 page chapter proposal clearly explaining the mission and focus of his or her proposed chapter. Authors of accepted proposals will be notified by March 19, 2012 about the status of their proposals and sent chapter guidelines. Full chapters are expected to be submitted by June 1, 2012. All submitted chapters will be reviewed on a double-blind review basis. Contributors may also serve as reviewers for this project.

    Publisher

    This book is scheduled to be published by IGI Global (formerly Idea Group Inc.), publisher of the "Information Science Reference" (formerly Idea Group Reference), "Medical Information Science Reference," "Business Science Reference," and "Engineering Science Reference" imprints. For additional information regarding the publisher, please visit www.igi-global.com. This book is anticipated to be released in 2012.



    Important Dates
    One page proposal- Due March 9, 2012
    Decisions- To authors by March 19, 2012
    Full chapters due- June 1, 2012
    Revisions to authors- July 1, 2012
    Revised/final chapters- August 15, 2012

    Inquiries and submissions can be forwarded electronically (Word document):
    Drew Polly
    ccssm.technology@gmail.com

    http://www.igi-global.com/authorseditors/authoreditorresources/callforbookchapters/callforchapterdetails.aspx?callforcontentid=2c930e5a-184e-48fe-badd-bf8b9bb1238b
Bonnie Sutton

Brown Center Report on American Education - 2 views

Achievement Gaps NAEP Common Core
started by Bonnie Sutton on 16 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    The 2012 Brown Center Report on American Education


    View online: http://www.brookings.edu/newsletters/browncenter/2012/0216.aspx
    ► Brookings.edu
    February 16, 2012


    Brown Center on Education Policy
    The 2012 Brown Center Report on American Education
    Today, we released the 2012 Brown Center Report on American Education. Researched and written by senior fellow Tom Loveless since 2000, the report uses empirical evidence to explore important questions in education policy and analyzes the state of American education, with a special emphasis on student learning measures, achievement test score trends, and education reform outcomes.

    Here's a brief summary of the three studies in this year's report:
    Predicting the Effect of the Common Core State Standards on Student Achievement: The Common Core will have little to no effect on student achievement. The quality or rigor of state standards has been unrelated to state NAEP scores, Loveless finds. Moreover, most of the variation in NAEP scores lies within states, not between them. Whatever impact standards alone can have on reducing within-state differences should have already been felt by the standards that all states have had since 2003.

    Measuring Achievement Gaps on NAEP: The Main NAEP consistently reports larger SES achievement gaps than the Long-Term Trend NAEP. The study examines gaps between students who qualify for free and reduced lunch and those who do not; black and white students; Hispanic and white students; and English language learners and students who are not English language learners. Loveless writes that "The biggest discrepancy between the tests is with ELL students. That suggests that the role language plays on the two tests - which is quite different, even in math - may be influencing the magnitude of the gaps."

    Misinterpreting International Test Scores: Educators and policymakers often misinterpret international test scores in three ways: 1) Dubious Conclusions of Causality, 2) The Problem With Rankings, and 3) The A+ Country Fallacy. The errors are usually committed by advocates of a particular policy position who selectively use data to support an argument, argues Loveless. Dubious Causal Conclusions refers to attributing a change in test scores to a single policy change; the case of Poland, which accomplished large gains on the PISA reading test, is used to illustrate, and the theory that tracking reform produced the gains is not supported by the evidence. The Problem with Rankings shows how rankings can distort a nation's relative standing by exaggerating small changes in test scores or, the reverse, making large changes appear less significant than they really are. The A+ Country Fallacy refers to the habit of pointing to high-performing countries and assuming that their policies must be good.
    Download the full report (PDF) »
    View past Brown Center reports »

    In a related video, Tom Loveless provides an overview of the 2012 Brown Center Report on American Education's key findings and conclusions.
Bonnie Sutton

the whole child - 1 views

whole child school reform
started by Bonnie Sutton on 09 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    "It is a miracle that curiosity survives formal education." -- Einstein Learn what's really happening in the world of education with veteran education writer Valerie Strauss and her guest


    Posted at 04:00 AM ET, 02/09/2012
    Taking a stand for 'the whole child' approach to school reform
    By Valerie Strauss

    This was written by Sean Slade, director of Healthy School Communities, a program of the ASCD, an educational leadership organization.

    By Sean Slade

    The White House recently launched its We the People initiative, which gives Americans a "new way to petition the Obama administration to take action on a range of important issues facing our country." At ASCD, we are taking President Obama up on his offer and are asking people to sign our petition to make "a whole child approach" to education reform a national priority.

    The White House has a National Security Council, a Council on Environmental Quality, the Council of Economic Advisors, the Council on Women and Girls, and the Council on Jobs and Competitiveness. It's time for a President's Council on the Whole Child.

    A whole child approach to education enhances learning by addressing each student's social, emotional, physical, and academic needs through the shared contributions of schools, families, communities, and policymakers. It is a move away from education policy that far too narrowly focuses on student standardized test scores as the key school accountability measure and that has resulted in the narrowing of curriculum as well as rigid teaching and learning environments.

    The true measure of student success is much more than a test score, and ensuring that young people achieve in and out of school requires support well beyond effective academic instruction. The demands of the 21st century require a new approach to education to fully prepare our nation's youth for college, career, and citizenship.

    Our last two Vision in Action Award winners, Price Laboratory School (PLS) in Iowa and Quest Early College High School in Texas, exemplify what we mean. Both of these schools work to ensure that each child is healthy, safe, supported, engaged and challenged, whether through a foundation of daily physical education for all grades K-12 and weekly health programs promoting empowerment and fresh, organic foods, as is the case at Price Lab, or through yearlong personal wellness plans and a focus on social/emotional as well as physical health at Quest.

    http://www.washingtonpost.com/blogs/answer-sheet/post/taking-a-stand-for-the-whole-child-approach-to-school-reform/2012/02/05/gIQARBcM0Q_blog.html

    Lessons and projects extend outside the classroom walls and into the local community. They are adapted to engage students and reworked to provide for personal learning styles and interests. Advisory groups - or "families" as they are called at Quest - abound and are a crucial part of making each teacher, student and family feel respected. And in both schools all are expected to achieve and are provided the mechanisms to do so. They don't just set the bar high; they provide the steps and supports to get over that bar.

    Both schools have gone beyond just a vision for educating the whole child to actions that result in learners who are knowledgeable, emotionally and physically healthy, civically active, artistically engaged, prepared for economic self-sufficiency, and ready for the world beyond formal schooling.

    But this ideal should not be found only in the occasional school. It should be found in all schools.

    ASCD therefore asks the Obama administration to establish a President's Council on the Whole Child to attend to the comprehensive needs of students. Such a council would:

    * Comprise educators, community members, state officials, national leaders, and other experts who would provide the president with expert counsel about how to coordinate the education, health, and social service sectors in support of our nation's youth.

    * Seek to reduce silo-ization of purpose and funding across these agencies and services, and promote collaborative efforts towards a common goal. Current federal programs and offices that address education, health, and safety of students too often function in isolation - and sometimes with contrary goals.

    * Strategically consolidate and coordinate programs and services for children that have been assembled in an ad hoc fashion over the decades, facilitating greater collaboration among the education, social, health, and safety agencies that support children.

    * Highlight the mutual obligation that educators and non-educators alike have for supporting the whole child and improving student achievement.

    * Involve and engage community members, organizations and agencies in working together towards a common and commonly desired goal.

    Education - as the president has highlighted in his last two State of the Union addresses - is a building block for our country's future success, with impact on the economy, the environment, and national security.

    No one would argue against a relevant, personalized, and meaningful education system. But we won't get there with a short-term focus on proficiency in reading and math. Instead, we need to address the broad array of factors that influence students' long-term success after high school graduation.

    It is time to put the whole child into focus and to develop systems, processes, and policies that promote growth and development.

    If you think a child's worth is more than a test score, sign ASCD's petition to create a President's Council on the Whole Child.

    Follow The Answer Sheet every day by bookmarking http://www.washingtonpost.com/blogs/answer-sheet.
Bonnie Sutton

The Digital Divide - 1 views

infographic digital divide technology spread costs Internet access
started by Bonnie Sutton on 09 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Posted by Drew Hendricks on Feb 8th, 2012

    The current Internet revolution provides amazing opportunities for entry-level professionals, college students, and entrepreneurs, but as the Infographic below shows, it is leaving a number of America's financially disadvantaged in its wake.

    In looking at the presented income divides, racial divides, city vs. rural divides, and U.S. vs. world divides, we can see huge disparities in the accessibility and affordability of broadband Internet. We are presented with some statistics that seem unbelievable to most of us in the Internet industry. If the current trend continues, do you think the technology of the future will spawn an elitist new generation of savvy users while the rest of the nation is left behind in the tech world? Some research suggests that as the technology evolves, ease of access and use continues to adjust to the target market - iPads have been great learning tools for the young as well as for senior citizens - so if new developments remain accessible enough to a wide consumer base, the learning curve will not be too severe. It's just a matter of cost.

    Despite unprecedented Internet growth, some 100 million Americans still lack broadband access at home, and this 'digital divide' falls along several lines. In 2010, computer and Internet use were compared with household income. Over 50% of those with an annual income of less than $25,000 had no Internet access. By contrast, among those with an income of $100,000 or more, only about 5% were without Internet access. Additionally, only 55% of black households even have Internet access.

    When we look at the differences in the cost of Internet access in the US and abroad, it's easy to see why accessibility is so limited for the disadvantaged. America is the world's richest country, but it ranks only 12th for Internet access. Verizon FiOS in the US costs 5X as much as comparable service in Paris, and 96% of Americans have access to only one or two providers. This limits competition and allows providers to charge rates that are out of reach for America's poor. Check it out for yourself.

    Digital Divide
    http://z6mag.com/technology/the-digital-divide-165125.html
Bonnie Sutton

Obama requests funding to help math, science teacher preparation - 2 views

White House Fair Funds for and Preparation math gaming project science education
started by Bonnie Sutton on 08 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    From staff and wire reports

    http://www.eschoolnews.com/2012/02/08/obama-requests-funding-to-help-math-science-teacher-preparation/

    President Obama launches a marshmallow from a cannon designed by 14-year-old Joey Hudy at the White House Science Fair Feb. 7.
    President Barack Obama on Feb. 7 called for millions of dollars in new funding to improve math and science education, an effort he said would be crucial to the nation's long-term success.

    Obama said his upcoming budget proposal, set to be released next week, would include a request for $80 million from Congress for a new Education Department competition to support math and science teacher preparation programs. Obama made a similar request to Congress last year, but the measure didn't pass.

    Separately, he announced $22 million in new investments from the private sector to support math and science efforts. Among the organizations committing fresh funding are Google and the Carnegie Corporation of New York.

    Obama said a renewed focus on math and science education should be an American imperative.

    "The belief that we belong on the cutting edge of innovation, that's an idea as old as America itself," Obama said. "We're a nation of thinkers, dreamers, believers in a better tomorrow."

    Obama has set a goal of preparing more than 100,000 math and science teachers and training a million additional science, technology, engineering, and math (STEM) graduates over the next decade.

    For more news about STEM education, see:

    $3 million gaming project could help spark STEM education

    Inquiry-based approach to science a hit with students

    Climate change skepticism seeps into classrooms

    Seeking to highlight the benefits of math and science education, Obama hosted a White House science fair earlier on Feb. 7, featuring projects designed by more than 100 students from across the country. The projects included a robot that helps senior citizens connect with their families via Skype and a portable disaster relief shelter that could be used to house people who have been displaced from their homes.

    "It's not every day you have robots running all over your house," Obama said of the science fair. "I'm trying to figure out how you got through the metal detectors."

    The president also mischievously helped fire an eighth-grader's award-winning high-speed marshmallow air cannon at the drapes of the White House's elegant State Dining Room, noting: "The Secret Service is going to be mad at me about this."
Bonnie Sutton

Who really benefits from putting high-tech gadgets in classrooms? - 2 views

Julius Genachowski digital playbook learning ecosystems textbooks
started by Bonnie Sutton on 07 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    ******************************
    From The Los Angeles Times, Saturday, February 4, 2012. See http://www.latimes.com/business/la-fi-hiltzik-20120205,0,639053.column. Our appreciation to Monty Neill, Executive Director, FairTest, for bringing this article to our attention.
    ******************************
    Who really benefits from putting high-tech gadgets in classrooms?

    How much genuine value is there in fancy educational electronics? Don't let companies or politicians fool you.

    By Michael Hiltzik
    Something sounded familiar last week when I heard U.S. Education Secretary Arne Duncan and FCC Chairman Julius Genachowski make a huge pitch for infusing digital technology into America's classrooms.
    Every schoolchild should have a laptop, they said. Because in the near future, textbooks will be a thing of the past.

    Where had I heard that before? So I did a bit of research, and found it. The quote I recalled was, "Books will soon be obsolete in the schools.... Our school system will be completely changed in 10 years."

    The revolutionary technology being heralded in that statement wasn't the Internet or the laptop, but the motion picture. The year was 1913, and the speaker, Thomas Edison, was referring to the prospect of replacing book learning with instruction via the moving image.

    He was talking through his hat then, every bit as much as Duncan and Genachowski are talking through theirs now.

    Here's another similarity: The push for advanced technology in the schoolroom then and now was driven by commercial, not pedagogical, considerations. As an inventor of motion picture technology, Edison stood to profit from its widespread application. And the leading promoter of the replacement of paper textbooks by e-books and electronic devices today is Apple, which announced at a media event last month that it dreams of a world in which every pupil reads textbooks on an iPad or a Mac.
    That should tell you that the nirvana sketched out by Duncan and Genachowski at last week's Digital Learning Day town hall was erected upon a sizable foundation of commercially processed claptrap [see http://wpc.1806.edgecastcdn.net/001806/aee/aee020111.html ]. Not only did Genachowski in his prepared remarks give a special shout-out to Apple and the iPad, but the event's roster of co-sponsors included Google, Comcast, AT&T, Intel and other companies hoping to see their investments in Internet or educational technologies pay off [see below or go to http://www.latimes.com/business/la-fi-hiltzik-20120205,0,639053.column to download the talk].

    How much genuine value is there in fancy educational electronics? Listen to what the experts say.
    "The media you use make no difference at all to learning," says Richard E. Clark, director of the Center for Cognitive Technology at USC. "Not one dang bit. And the evidence has been around for more than 50 years." [see http://www-bcf.usc.edu/~clark/ [
    Almost every generation has been subjected in its formative years to some "groundbreaking" pedagogical technology. In the '60s and '70s, "instructional TV was going to revolutionize everything," recalls Thomas C. Reeves, an instructional technology expert at the University of Georgia. [see http://it.coe.uga.edu/~treeves/ ] "But the notion that a good teacher would be just as effective on videotape is not the case."
    Many would-be educational innovators treat technology as an end-all and be-all, making no effort to figure out how to integrate it into the classroom. "Computers, in and of themselves, do very little to aid learning," Gavriel Salomon of the University of Haifa and David Perkins of Harvard observed in 1996. [see attachment or go to http://www.latimes.com/business/la-fi-hiltzik-20120205,0,639053.column to download information] Placing them in the classroom "does not automatically inspire teachers to rethink their teaching or students to adopt new modes of learning."

    At last week's dog-and-pony show, Duncan bemoaned how the U.S. is being outpaced in educational technology by countries such as South Korea and even Uruguay. ("We have to move from being a laggard to a leader" was his sound bite.)

    Does Duncan ever read his own agency's material? In 2009, the Education Department released a study of whether math and reading software helped student achievement in first, fourth, and sixth grades, based on testing in hundreds of classrooms. The study found that the difference in test scores between the software-using classes and the control group was "not statistically different from zero." [See attachment or go to http://www.latimes.com/business/la-fi-hiltzik-20120205,0,639053.column to download] In sixth-grade math, students who used software got lower test scores - and the effect got significantly worse in the second year of use.

    The aspect of all this innovative change that got the least attention from Duncan and Genachowski was how school districts are supposed to pay for it.

    It's great to suggest that every student should be equipped with a laptop or given 24/7 access to Wi-Fi, but shouldn't our federal bureaucrats figure out how to stem the tidal wave of layoffs in the teaching ranks and unrelenting cutbacks in school programs and maintenance budgets first? School districts can't afford to buy enough textbooks for their pupils, but they're supposed to equip every one of them with a $500 iPad?

    "There are two big lies the educational technology industry tells," says Reeves. "One, you can replace the teacher. Two, you'll save money in the process. Neither is borne out."
    Apple has become a major purveyor of the mythology of the high-tech classroom. "Education is deep in our DNA," declared Phil Schiller, Apple's marketing chief, at its Jan. 19 education event. "We're finding that as students are starting to be introduced to iPad and learning, some really remarkable things are happening." [see http://events.apple.com.edgesuite.net/1201oihbafvpihboijhpihbasdouhbasv/event/index.html ]

    If you say so, Phil. But it's proper to point out the downside to one great innovation Schiller touted, a desktop publishing app called iBooks Author. The app is free, and plainly can help users create visually striking textbooks. But buried in the user license is a rule that if you sell a product created with iBooks Author, you can sell it only through Apple's iBookstore, and Apple will keep 30% of the purchase price. (Also, your full-featured iBook will be readable only on an Apple device such as an iPad.)
    Among tech pundits, the reaction to this unusual restriction has ranged from citing its "unprecedented audacity" [http://venomousporridge.com/post/16126436616/ibooks-author-eula-audacity] to calling it "mind-bogglingly greedy and evil." [see http://www.zdnet.com/blog/bott/apples-mind-bogglingly-greedy-and-evil-license-agreement/4360 ] Apple won't comment for the record on the uproar. Whatever you think of it, the rule makes clear that Apple's interest in educational innovation is distinctly mercantile. But that didn't keep Genachowski from praising Apple's education initiative as an "important step." (Perhaps he meant a step toward enhanced profitability.)
    Of course Apple draped its new business initiative in all sorts of Steve Jobsian pixie dust, as if it's all about revolutionizing education. The company's most amusing claim is that iPads are somehow more "durable" than textbooks and therefore more affordable, over time. Its website weeps copious crocodile tears over the sad fate of textbooks [http://www.apple.com/education/ibooks-textbooks/ ] - "as books are passed along from one student to the next, they get more highlighted, dog-eared, tattered and worn." Yet as James Kendrick of ZDNet reports, school administrators who have handed laptops out to students to take home say the devices get beaten nearly to death in no time. [http://www.zdnet.com/blog/mobile-news/one-thing-may-rock-the-apple-ipad-for-education-scheme-kids/6505 ] The reality is obvious: Drop a biology textbook on a floor, you pick it up. Drop an iPad, you'll be sweeping it up.

    Some digital textbooks may have advantages over their paper cousins. Well-produced multimedia features can improve students' understanding of difficult or recondite concepts. But there's a fine line between an enhancement and a distraction, and if textbook producers are using movies and 3-D animations to paper over the absence of serious research in their work, that's not progress.

    Nor is it a given that e-books will be cheaper than bound books, especially when the cost of the reading devices is factored in. Apple tries to entice schools to buy iPads in blocks of 10 by offering a lavish discount of, well, $20 per unit. They still cost $479 each. The company also provides a bulk discount on extended warranties for the device, but - surprise! - it doesn't cover accidental damage from drops or spills.

    Apple and its government mouthpieces speak highly of the ability to feed constant updates to digital textbooks so they never go out of date. But that's relevant to a rather small subset of schoolbooks such as those dealing with the leading edge of certain sciences - though I'm not sure how many K-12 pupils are immersed in advanced subjects such as quantum mechanics or string theory. The standard text of "Romeo and Juliet," on the other hand, has been pretty well locked down since 1599.

    There's certainly an important role for technology in the classroom. And the U.S. won't benefit if students in poor neighborhoods fall further behind their middle-class or affluent peers in access to broadband Internet connectivity or computers. But mindless servility to technology for its own sake, which is what Duncan and Genachowski are promoting on behalf of self-interested companies like Apple, will make things worse, not better.

    That's because it distracts from and sucks money away from the most important goal, which is maintaining good teaching practices and employing good teachers in the classroom. What's scary about the recent presentation by Duncan and Genachowski is that it shows that for all their supposed experience and expertise, they've bought snake oil. They're simply trying to rebottle it for us as the elixir of the gods.
    ----------------------------------
    SIDEBAR: Arne Duncan and Julius Genachowski discuss technology in education.
    ---------------------------
    PHOTO SIDEBAR: U.S. Education Secretary Arne Duncan, left, and FCC Chairman Julius Genachowski speak at a Digital Learning Day event sponsored in part by Google, Comcast, AT&T and Intel. (Mark Wilson, Getty Images / February 5, 2012)
    ----------------------------------
    Michael Hiltzik's column appears Sundays and Wednesdays. Reach him at mhiltzik@latimes.com, read past columns at latimes.com/hiltzik, check out facebook.com/hiltzik and follow @latimeshiltzik on Twitter.
    ****************************************

    PREPARED REMARKS AT DIGITAL LEARNING DAY TOWN HALL
    CHAIRMAN JULIUS GENACHOWSKI
    FEDERAL COMMUNICATIONS COMMISSION
    THE NEWSEUM
    WASHINGTON, D.C.
    FEBRUARY 1, 2012

    Thank you all for joining us here at the Newseum and online across the country.

    I want to thank everyone who is stepping up to seize the opportunities of digital learning,
    in particular Governor Wise and the Alliance for Excellent Education for hosting today's
    event.

    Thank you, Xavier, for being here, and thank you to AT&T for their participation in this initiative.
    I'd like to also thank all the private sector partners of the Digital Textbook Collaborative,
    Josh Gottheimer and Jordan Usdan on my team at the FCC, and Karen Cator at the
    Department of Education.

    And finally, thank you to Secretary Arne Duncan for your leadership at the Department
    of Education and your partnership on digital learning.

    At the FCC, our mission is to harness the power of broadband and communications
    technology to improve the lives of the American people.

    Few areas hold more promise for broadband-enabled innovation and improvements than
    education.

    For example, broadband enables distance learning and collaboration, connecting students
    wherever they are to information, tutors, teachers, and other students.

    Studies show technology makes a real difference. Technology-based teaching can reduce
    the time it takes to learn a lesson by 30 to 80 percent.

    The FCC has been working since the early days of the commercial Internet to bring the
    benefits of online learning to America's schools.

    Our E-Rate program - established in the 1990s - has helped connect almost every
    classroom in America to the Internet. And we recently modernized our E-Rate program
    to seize the opportunities of mobile connectivity.

    Now, it's time for the next stage - or chapter if you will - in education technology:
    digital textbooks. Digital textbooks are one of the cornerstones of digital learning.
    When we talk about transitioning to digital textbooks, we're not just talking about giving
    students e-readers so they no longer have to carry around backpacks filled with 50
    pounds of often out-of-date textbooks.

    We're talking about students having interactive learning devices that can offer lessons
    personalized to their learning style and level, and enable real-time feedback to parents,
    teachers, or tutors.

    Imagine a student who has trouble doing his geometry homework; the digital textbook
    automatically inserts a supplemental lesson.

    Imagine a teacher who has instant access to the results of a pop quiz; she can immediately
    see that four of her students didn't understand the concept of photosynthesis and is able
    to offer an extra lesson.

    We've seen digital textbooks adopted in pockets around the country, but adoption is not
    widespread and too skewed to wealthier areas.

    Meanwhile, too many students still have textbooks that are 7 to 10 years old. And some
    students are using history books that don't even cover 9/11.

    It's not just the content of textbooks that needs updating, it's the concept. We often talk
    about how technology has changed everything, but static, hardcover textbooks are what I
    used in school, what my parents used, what their parents used and so on.

    We spend $7 billion a year on textbooks in this country, but digital textbooks - this
    massive innovation - remain the exception, not the rule.

    We can do better. And I envision a society spending less on textbooks, but getting more
    out of them.

    We all win if the players in the digital learning ecosystem - including publishers, device
    manufacturers, platform providers, internet service providers, schools - work together to
    accelerate the adoption of digital textbooks. If they work together to address the
    obstacles of broadband deployment and adoption, content development, interoperability
    and device costs.

    To date, many of these players have taken important steps.

    Costs of the tablets are coming down and many content players, including the longstanding
    incumbents and new entrants, are working to transition to a digital world.

    Connectivity is still an obstacle. About a third of Americans - 100 million people - still
    haven't adopted broadband at home. Digital textbooks can't work without this home
    connectivity.

    We've launched a major public-private initiative called Connect to Compete, to promote
    broadband adoption, and we've seen major companies like Microsoft, Best Buy, and the
    cable companies step forward with significant commitments to promote adoption.

    I commend Comcast for their continued commitment to broadband adoption. The
    extension of their Internet Essentials program to thousands of additional families will
    help bring the benefits of broadband to more students and families.

    Just yesterday, the FCC approved a measure to modernize our Lifeline program -
    establishing Broadband Adoption Pilots to begin transitioning Lifeline from a program
    that supports phone service to one that supports Internet access.

    Apple took an important step last month with its announcement of new textbooks and a
    publishing suite for the iPad.

    Recognizing the need to do more, the FCC partnered with the Department of Education
    to convene the Digital Textbook Collaborative.

    The collaborative enlisted partners from across the digital learning ecosystem to compile
    best practices on how schools can go digital.

    I'm pleased that today the Collaborative is releasing a Digital Learning Playbook.
    This Playbook offers information about how to achieve the robust connectivity that is
    necessary for digital learning inside and outside school. It also details important
    considerations when implementing digital devices. Finally, it identifies the best practices
    for making a successful transition to digital learning.

    This is an important start but there's more work to do. Countries like South Korea - which
    has announced it is going to all-digital textbooks in 2013 - are still ahead of us.

    If we want American students to be the best prepared to compete in the 21st century
    global economy, we can't allow a majority of our students to miss out on the
    opportunities of digital textbooks.

    Today, I want to challenge everyone in the space - companies, government officials,
    schools and teachers - to do their part to make sure that every student in America has a
    digital textbook in the next five years.

    Next month, Secretary Duncan and I will be convening a meeting with CEOs in the
    digital textbook space to work toward achieving this goal.

    Digital learning is critical to the future of education in our country and to our global
    competitiveness. I'm pleased that we're taking steps to seize the opportunities before us.
Bonnie Sutton

How real school reform should look (or explaining water to a fish) - 1 views

school reform not a standard body of knowledge political paralysis education change
started by Bonnie Sutton on 06 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    How real school reform should look (or explaining water to a fish)
    By Valerie Strauss
    http://www.washingtonpost.com/blogs/answer-sheet/post/how-real-school-reform-should-look-or-explaining-water-to-a-fish/2012/02/04/gIQAHrQNpQ_blog.html

    This was written by Marion Brady, veteran teacher, administrator, curriculum designer and author. This first appeared on truth-out.org .

    By Marion Brady

    Imagine the present corporately promoted education reform effort as a truck, its tires nearly flat from the weight of the many unexamined assumptions it carries.

    On board: An assumption that punishment and rewards effectively motivate; that machines can measure the quality of human thought; that learning is hard, unpleasant work; that what the young need to know is some agreed-upon, standard body of knowledge; that doing more rigorously what we've always done will raise test scores; that teacher talk and textbook text can teach complex ideas; that ... well, you get the idea.

    Misdiagnosing the Main Problem

    Right now, the biggest, heaviest assumption on the reform truck has it that, when the Common Core State Standards Initiative is complete - when somebody has decided exactly what every kid in every state is supposed to know in every school subject at every grade level - the education reform truck will take off like gangbusters.

    It won't. If all the reformers' flawed assumptions are corrected, but the traditional math-science-language-arts-social-studies "core curriculum" remains the main organizer of knowledge, the truck may creep forward a few inches, but it won't take the young where they need to go if we care about societal survival. The mess from this generation's political paralysis and refusal to address looming problems can't be cleaned up using the same education that helped create it.

    What's wrong with "the core"? For its content to be processed, stored in memory, retrieved and combined in novel ways to create new knowledge, it would have to be well organized and integrated. It isn't. It's a confusing, random, overwhelming, intellectually unmanageable assortment of facts, specialized vocabularies, disconnected conceptual frameworks, and abstractions - the whole too far removed from life as the young live it for them to care about it.

    So, they don't. They're being blasted with information at fire-hose velocity. The diligent and the fearful store as much as they can in short-term memory, and when testing is over, their brains delete what's considered clutter because it's not immediately useful. The non-diligent and the cynical guess and/or cheat on the answer sheets. The rest (and their numbers, understandably, are steadily increasing) opt out of the trivia game, or are opted out by thoughtful, caring parents.

    A Different Organizer

    There's an alternative to the core as an organizer of information and knowledge. We use it from birth to death, and we didn't learn it in school. It's the key to an order-of-magnitude improvement in learner performance.

    For firsthand evidence of that system's potential, consider how much we learn and how fast we learn it long before we walk through school doors. Starting from scratch, we figure out how to meet personal needs; learn what's acceptable and unacceptable behavior; construct explanatory theories; master one or more complex languages; adapt appropriately to many different personality types; absorb the foundational patterns of action and premises of one or more cultures; and much, much else.

    Our "natural" knowledge-organizing and integrating system's main components are those we use to create the most complete and sophisticated models of reality known - stories. To make sense of any and all reality, we seek answers to just five questions - Who? What? When? Where? Why? All knowledge is an elaboration of one or more of those five distinct kinds of information.

    "We did something," communicates.

    "Because we were bored, Tanya and I went to the mall yesterday," elaborates.

    "Because Tanya Jones and I, Mary Smith, were bored, we went to Bath and Body Works in Eastland Mall in Columbus, Ohio, arriving in the parking lot a few minutes after three o'clock on the 13th of January," elaborates further.

    The exercise could continue, adding layers of increasing precision.

    The more we know about a particular subject, situation or science, the more elaborations we have from which to choose. When Hippocrates wrote about cancer in 400 BC, he almost certainly didn't see it as a group of diseases, each sufficiently different from the others to warrant the range of labels that help today's researchers and doctors think and talk about cancer more productively. As cancer research advances, the elaborating process will continue.

    We make sense by choosing from elaborating options for who, what, when, where and why, and weaving our choices together systemically. As options increase and potential systemic relationships multiply, ever-better sense is made, creativity is stimulated and knowledge expands.

    An Unknown Known

    Our sense-making system - like the concept of gravity before Sir Isaac Newton - is so familiar we don't think of it as a system. And, when it's pointed out, we tend to dismiss it as too simple and obvious to be important, much less the key to educational transformation. But made explicit and put to work, our implicitly known knowledge organizer moves learner performance to levels far beyond the reach of the measurement capabilities of standardized tests, including the ones on which international comparisons are based.

    Skillful use of the system can't be taught in the usual sense of the word - can't, that is, be transferred in useable form from mind to mind by words on a page, images on a screen or lectures from a stage. Learners have to construct understanding for themselves.

    To appreciate the teaching-learning challenge, imagine trying to explain water to a fish. Success requires that the utterly familiar be made "strange enough to see." A five-hour lecture to a fish on the subject of water wouldn't match the memorable experience of being lifted out of the water for a five-second exposure to air.

    Experience is the best teacher, but attention must be paid. Adolescents, encouraged to look long and hard at particular, ordinary experiences - and to think and talk about what they're doing - eventually discover their basic, five-element approach to sense-making. They've lived long enough to have experiences they can analyze, are mature enough to examine those experiences introspectively and haven't yet been programmed by schooling to sort what they know into disconnected boxes with subject-matter labels.

    Reasoning their way to those five distinct kinds of information, they "own" the foundation of their knowledge-categorizing and -manipulating system. No reading from a textbook, no listening to a lecture, no viewing of a video production will ever match the level of understanding that emerges from firsthand experience refined by dialogue.

    The Challenge of Change

    Making deliberate use of our usual system for organizing knowledge doesn't discard academic disciplines or the school subjects based on them. It elevates and enhances them; puts them in context; and makes them mutually supportive, systemically integrated parts of each learner's seamless "model of reality" - the mental template laid down on particular experience to generate questions leading to the making of sense.

    Ironically, it's probable that use of the system would perpetuate the curse of standardized testing. When kids know how their mental "filing systems" work, and make use of them to retrieve trivia from memory, scores will go up. But the long-term positives of using familiar school subjects and procedures to smooth the change process cancel the negatives, primarily by allowing the process to be evolutionary rather than revolutionary.

    Eventually, as making more sense of experience replaces the ubiquitous "preparing for college and career" as the working aim of schooling, broader change will follow. Coming (as it should) "bottom up," from teachers and learners focused on improving sense-making rather than on raising test scores, the direction of change will always be appropriate.

    There will be surprises, but they'll be pleasant. A major one will be the discovery that kids are far smarter than they're given credit for being. Another will be that adequately feeding the left, order-seeking side of the brain takes much less time than is currently being devoted to it. A third related surprise will be that the time thus released will make possible a world of useful educational activities - projects, apprenticeships, advanced studies, and so on. A fourth will be that a much better education can be had for considerably less money.

    To begin to make use of our natural system for making sense, a little handholding should help. A rough, first-generation tool for that purpose titled Connections: Investigating Reality (think of it as a beta version) can be downloaded from the Internet. In the spirit of "open source," and acknowledging a deep-seated American aversion to spending public money on educating, it's free to individual educators for use with students.

    Connections requires no special training, no additional materials and no new technology. It does, however, require teachers or mentors who are willing to play a non-traditional role. Present textbooks and teacher talk offer learners secondhand, supposedly expert knowledge about reality. Connections directs attention to reality itself in all its inherent complexity, and poses questions or problems. Particular realities may be as mundane as the arrangement of furniture in a classroom, a familiar television commercial, a popular children's book, an obscure folk song. The young - less programmed by life experience - may see in them what the teacher does not. Sufficient humility to accept that fact and encourage its demonstration is appropriate.

    Connections makes provision for user dialogue. If advantage is taken of the tool, the differing perspectives and collective wisdom of teachers and learners will allow the general education curriculum to continuously adapt to the needs and trends of the era.

    On the Other Hand

    When the CEOs and the politicians they've bought finish the simplistic "reform" they've started, when the claim that an order-of-magnitude improvement in learner intellectual performance is possible has been dismissed as hyperbole, when all 50 states have been pressured to adopt the regressive Common Core Standards locking the knowledge-fragmenting 1893 curriculum in permanent place, when standardized subject-matter tests that can't measure the qualities and quality of thought have been nationalized, when the "standards and testing police" are fully deployed and looking over every teacher's shoulder, it'll all be over. America and the nations that follow its lead in education will face a dynamic world equipped with a static curriculum.

    Catastrophe will be inevitable.

    -30-

    Follow The Answer Sheet every day by bookmarking http://www.washingtonpost.com/blogs/answer-sheet .
Bonnie Sutton

Copyright: Reaching Out to Teachers and Students - 1 views

Library of congress teachers and students US Copyright Office
started by Bonnie Sutton on 06 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Copyright: Reaching Out to Teachers and Students
    February 2nd, 2012 by Cheryl Lederle

    http://blogs.loc.gov/teachers/2012/02/copyright-reaching-out-to-teachers-and-students/

    This is a guest post from David Christopher, Chief, Information and Records Division, U.S. Copyright Office.

    When I was young - and I'm not that old - the term "copyright" and its curious symbol, ©, seemed a quaint holdover from a bygone era. It was for me a fuzzy legal term that book publishers thought highly enough of to place on the verso of the title page of every book I ever picked up, but it certainly had no real impact on me or my life.

    Boy, have things changed. The Internet, coupled with smart phones, tablets and all of the other wonderful gadgets we use to create, share and enjoy creative works, gives each of us the power to engage in infringing activities, whether knowingly or not, literally on a global scale. Copyright is indeed a hot topic these days (note the reaction to the proposed SOPA/PIPA anti-piracy legislation in online media) and, if anything, copyright matters will only increase in the public consciousness in the years to come.

    Given the increased relevance of copyright in the digital age, the U.S. Copyright Office, located here at the Library of Congress, recognizes the need to engage more proactively in public education and outreach. Last October, Maria A. Pallante, Register of Copyrights and Director of the Copyright Office, released a list of Priorities and Special Projects that the Copyright Office will engage in over the 2011-2013 period. Notable among the special projects, we are currently in the early stages of developing a business plan for a robust copyright education and outreach program.

    The goal of this effort is to implement a series of new education projects tailored to a variety of audiences including librarians, teachers, artists, copyright practitioners, and the general public. While we already offer online educational resources designed for teachers and students like Taking the Mystery Out of Copyright and the professional development modules Copyright and Primary Sources and Understanding Copyright, we want to offer more.

    Look for additional news from the Copyright Office, including guest posts on this blog, announcing program developments and initiatives in the coming months. In the meantime, please contact us with questions or suggested copyright-related topics of particular interest to teachers through our Contact Us page.
    http://www.loc.gov/teachers/copyrightmystery/
Bonnie Sutton

Pathways to Prosperity Report , Harvard University - 1 views

College for all pathways to prosperity Harvard University
started by Bonnie Sutton on 06 Feb 12 no follow-up yet
Bonnie Sutton

PLAYBACK: News on Teens and Blogs, Facebook, Twitter and Google+, And Schools That Don'... - 0 views

Facebook. Twitter and Google+. Parent's Guide to Facebook News on Teens Blogs
started by Bonnie Sutton on 05 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Spotlight on Digital Media and Learning


    PLAYBACK: News on Teens and Blogs, Facebook, Twitter and Google+, And Schools That Don't Allow Them

    Posted: 03 Feb 2012 02:22 PM PST

    In this week's PLAYBACK, blogging is better than diary writing in relieving stress, a new Parent's Guide to Facebook, S. Craig Watkins on what kids miss out on when schools block social media, and more.

    ---

    Filed by Christine C.

    Blogging as Therapy: A new study published in the journal Psychological Services has concluded that blogging openly about the trials and tribulations of teenage life offers even greater therapeutic value than keeping a personal diary.

    The study, based on 161 high school students in Israel - 124 girls and 37 boys, with an average age of 15 - found that writing and engaging with an online community was most effective in relieving stress. While the study's authors acknowledge that the skewed sex ratio was a limitation of the study, they found boys and girls responded similarly.

    "Research has shown that writing a personal diary and other forms of expressive writing are a great way to release emotional distress and just feel better," the study's lead author, Meyran Boniel-Nissim, PhD, of the University of Haifa, Israel, said in a press release from the American Psychological Association. "Teens are online anyway, so blogging enables free expression and easy communication with others."

    "Although cyberbullying and online abuse are extensive and broad, we noted that almost all responses to our participants' blog messages were supportive and positive in nature," said the study's co-author, Azy Barak. "We weren't surprised, as we frequently see positive social expressions online in terms of generosity, support and advice."

    One 17-year-old blogger from Norfolk, Va., who did not participate in the study told The New York Times: "People will write in the comments, 'I remember when I was in your shoes' and 'Don't worry - you'll get through the SATs!' and it's wonderful. It really helps put everything into perspective."

    Plus: In the article "Coming of Age Online: The Developmental Underpinnings of Girls' Blogs," Katie Davis, a doctoral student at Harvard Graduate School of Education, interviewed 20 young women, ages 17 to 21, who have been blogging for at least three years. Putting her study in the context of other studies of youths' online activities, Davis found that the changes in the content and style of the blogs themselves reflect the long-understood changes in social and cognitive development in youth. Read more about girls carving out their own space online.

    Social Media as an Integral Part of Adolescence: More than a century ago, the telephone was seen as a technological threat to the social order. "Men would be calling women and making lascivious comments, and women would be so vulnerable, and we'd never have civilized conversations again," Megan Moreno, a specialist in adolescent medicine at the University of Wisconsin-Madison, tells Perri Klass, a physician who writes for The New York Times.

    Sound familiar? While much of the early research on adolescent use of social media tended to focus on potential dangers, Klass notes that there now seems to be a more nuanced understanding of how teenagers are spending time online, and research reflects that.

    "We should not view social media as either positive or negative, but as essentially neutral," said Michael Rich, a pediatrician and the director of the Center on Media and Child Health at Children's Hospital Boston. "It's what we do with the tools that decides how they affect us and those around us."

    Klass, who travels back and forth between the worlds of academic pediatrics and academic journalism, writes that she is "struck by the focus in both settings on the potential - and the risks - of social media and on the importance of understanding how communication is changing." Continue reading her thoughtful assessment.

    Plus: Concerns about social media paving a direct pathway to bad behavior may also be alleviated by the latest University of Michigan Monitoring the Future study. The report, which has tracked teenage risk behaviors since 1975, shows that adolescent use of alcohol, tobacco and most illegal drugs is far lower than it was three decades ago. Here's the overview of key findings (pdf).

    A New York Times column based on the report, "The Kids are More Than All Right," will appear in this weekend's Sunday Magazine.

    "Nobody knows exactly why sex and drug use has declined among teenagers, but there are a number of compelling possibilities that may have contributed," writes Tara Parker-Pope. "The last three decades have included a rise in the drinking age to 21; a widespread fear of H.I.V.; and legal challenges that stymied tobacco marketing. And while cellphones and Facebook have created new ways for teenagers to stir up trouble, they may also help parents monitor their children. Still, today's children have found ways to rebel (think energy drinks and sexting) that aren't tracked in national surveys."

    Everything You Wanted to Know About Facebook and Were Afraid to Ask: If you're a parent struggling to understand your child's use of Facebook, this 34-page Parent's Guide to Facebook, just released by ConnectSafely.org, is for you.

    The 2012 version features updated information about Facebook's Timeline, privacy controls and social reporting. Read Anne Collier's post to learn more.

    Teens and Twitter: While some teenagers are adding Twitter to the mix of social networks they use, they're not giving up Facebook to do it. That's the gist of reactions to this AP story, which suggested teens were migrating from one social space to the other. As Emil Protalinski writes at ZDNet:

    The report quotes findings from the Pew Internet & American Life Project, a nonprofit organization that monitors people's tech-based habits, to prove its point. Specifically, the data shows 8 percent of teenagers used Twitter in late 2009, and the number doubled to 16 percent in July 2011.

    That's not the whole story, though. In July 2011, Pew also asked some questions of just those teenagers who maintain social media accounts. Taking this slightly smaller number of users, Pew split them into two groups: those who have one social media account, and those who have multiple social media accounts. The numbers there were much more telling: of teenagers who use just one social network, 89 percent are using Facebook. Less than 1 percent are using just Twitter. Of teenagers who have more than one social network, 99 percent are using Facebook, and 29 percent are using Twitter as well.

    Plus: Now that Google+ has opened up to teens age 13 and older, hanging out may take on a whole other meaning.

    Here's the Google+ Teen Safety Guide. One of the interesting features, writes Neil Vidyarthi, is the Hangout kickout: "If a teenager is in a video/audio Hangout and a person enters the chat who is not in one of their circles, the teenager is booted out of the Hangout until they add the person to their network. Not exactly an impenetrable solution, but a small gesture to educate teenagers and one that parents may appreciate."

    When Schools Block Social Media: S. Craig Watkins nudges the debate about social media use in schools by adding his insight, based on the past year spent working in a high school, about what schools end up blocking when they block social media. In one word: opportunity.

    Watkins, author of "The Young and the Digital: What the Migration to Social Network Sites, Games, and Anytime, Anywhere Media Means for Our Future" (Beacon, 2009), describes working with a teacher in a technology application class on using digital media to tell compelling stories about the challenges of teen life. But the kids can't critique PSA narrative styles and strategies because they can't access YouTube. They can't work during class on a Facebook poll to build "user personas" of their peers because they can't get on Facebook. And not all of them have internet access at home.

    We are learning a lot about how young people from this community, which has been hit especially hard by the recession and the growing wealth gap in the United States, are managing their participation in the digital world. The old theories about the digital divide - the access narrative - only explain a small part of what is happening in edge communities.

    The real issue, of course, is not social media but learning. Specifically, the fact that our schools are disconnected from young learners and how their learning practices are evolving.

    Read the rest here.
    http://spotlight.macfound.org/blog/entry/news-on-teens-and-blogs-facebook-twitter-google-and-schools/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+macfound%2FiQaL+Spotlight+on+Digital+Media+and+Learning#When:22:22:00Z
Bonnie Sutton

Want your kid to be a scientist? Start in elementary school. - 0 views

Kids as scientists Khan academy elementary school the starting point for science cartoons at site.
started by Bonnie Sutton on 04 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Want your kid to be a scientist? Start in elementary school.

    By Priya Natarajan, Published: February 2

    "What's your major?" Ask a college freshman this question, and the answer may be physics or chemistry. Ask a sophomore or a junior, however, and you're less likely to hear about plans to enter the "STEM" fields - science, technology, engineering and mathematics. America's universities are not graduating nearly enough scientists, engineers and other skilled professionals to keep our country globally competitive in the decades ahead.

    And this is despite evidence such as a recent Center on Education and the Workforce report that forecasts skill requirements through 2018 and clearly shows the importance of STEM fields. The opportunities for those with just a high school education are restricted, it says - many high-paying jobs are open only to people with STEM college degrees.

    Still, as many as 60 percent of students who enter college with the intention of majoring in science and math change their plans. Because so many students intend to major in a STEM subject but don't follow through, many observers have assumed that universities are where the trouble starts. I beg to differ.

    I am a professor of astronomy and physics at Yale University, where I teach an introductory class in cosmology. I see the deficiencies that first-year students show up with. My students may have dexterity with the equations they're required to know, but they lack the capacity to apply their knowledge to real-life problems. This critical shortcoming appears in high school and possibly in elementary grades - long before college. If we want more Americans to pursue careers in STEM professions, we have to intervene much earlier than we imagined.

    Many efforts are underway to get younger students interested in science and math. One example is the Tree of Life's online "treehouse" project, a collection of information about biodiversity compiled by hundreds of experts and amateurs. Students can use this tool to apply what they are learning in the classroom to the world around them. Starting early in children's education, we need to provide these types of engaging, interactive learning environments that link school curricula to the outside world.

    My own schooling is an example. Growing up in Delhi, India, I did puzzles, explored numbers and searched for patterns in everyday settings long before I ever saw an equation. One assignment I vividly remember asked us to find examples of hexagons. I eagerly pointed out hexagons everywhere: street tiles, leaves, flowers, signs, buildings. I was taught equations only after I learned what they meant and how to think about them. As a result, I enjoyed math, and I became good at it.

    Not all American children have this experience, but they can. The Khan Academy, for example, has pioneered the use of technology to encourage unstructured learning outside the classroom and now provides teaching supplements in 36 schools around the country. For instance, recent reports describe a San Jose charter school using Khan's instructional videos in ninth-grade math classes to tailor lessons to each student's pace.
Bonnie Sutton

Guest post: An 'Arab Spring' of free online higher education By Daniel de Vise - 2 views

Free Higher Education online college courses Udacity Startup
started by Bonnie Sutton on 03 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Guest post: An 'Arab Spring' of free online higher education
    By Daniel de Vise

    http://www.washingtonpost.com/blogs/college-inc/post/guest-post-an-arab-spring-of-free-online-higher-education/2012/02/03/gIQAXiOFnQ_blog.html

    In recent days, we have heard President Obama lecture college presidents about cost control, and we have seen a vaunted Stanford professor quit to pursue teaching students by the millions online - at minimal cost.

    Here, to connect the dots, is a guest post from Abir Qasem, a computer scientist at Bridgewater College in Virginia, and Tanya Gupta, a senior resource management officer at the World Bank.


    Two recent events presage higher education's future: first, President Obama's University of Michigan speech about cost control in higher education, and second, Sebastian Thrun's Udacity start-up.

    There is an immense pressure to do something about the prohibitive cost of higher education, immense enough to be the first key topic of the President's post-State of the Union tour at the University of Michigan. With this speech, the President brought the 500-pound gorilla into the national conversation. In the 21st century, he said, "higher education is not a luxury - it's an economic imperative," and institutions should "improve affordability" and ensure "higher rates of college completion".

    For students, it is not always clear that the return on their investment will be positive, as a degree no longer guarantees a job. If education is both expensive and has a low ROI, the demand for traditional education is likely to fall.

    And then there is Sebastian Thrun, a tenured professor of computer science at Stanford who, a few days ago, announced that he had given up his teaching role to found Udacity, an education start-up that would offer low-cost online classes. Thrun was inspired by the success of the online AI course he offered along with his colleague Peter Norvig to bring "education" directly to the consumer. Thrun said he was motivated in part by teaching practices that evolved too slowly to be effective. With this move Thrun replaced the prohibitive cost of a "middleman" (the College) with technology.

    When Christensen wrote in an earlier College Inc. blog post, "Technology and innovation make it possible to grow our way out of financial trouble and organizational resistance to change," he could have been writing about President Obama's imperative to cut the cost of education and Thrun's (and others') initiative to overcome resistance to change.

    The stars are aligned for this new disruption to emerge - whether you call it "the unbundling of the university," the "modularization of education" or "eliminating the middleman" (the College). Steve Jobs said, "You can't connect the dots looking forward; you can only connect them looking backwards." However, when some of the bread crumbs start to line up, it is an indication that a change is coming.

    The bread crumbs? They are Sal Khan's Khan Academy, MITx, and most recently, Thrun's move. With Khan Academy, Sal Khan made available, free of cost, 2,800+ educational videos to the whole world, all made in his own unique style, the way he wishes he had been taught. MIT OpenCourseWare (OCW) makes MIT course content available for free on the Web. MITx goes one step further and offers an online learning platform that will, in addition, feature interactivity, online laboratories, student-to-student communication, and individual assessment, and will offer a certificate of completion awarded by MITx. MITx is based on an open-source, scalable software infrastructure. It will be free to use and "highly affordable" for those who wish to get a credential. Many other Khan Academies and MITxs are emerging. Academic Earth offers free access to video courses and academic lectures from leading colleges and universities. Udemy offers courses online, and Code Academy teaches you how to code online.

    So then, what exactly is the change? For centuries now, the university has been the middleman of the higher education system. The university provided the needed infrastructure, the branding, and an easy route to a white-collar job or graduate school. In return, students had to agree to take courses that the faculty thought were needed. The courses could be recommended because they would help the student understand the subject, or for other completely unrelated reasons (to make them a "well rounded" person, or to give a faculty colleague some students to teach). Faculty, on the other hand, did not have to look for students, could bask in the reflected glory of the university name, and still had a regular paycheck. Accreditors were the accountants of academia, making sure that "quality" was maintained.

    The astonishing pace of technology in the last few years has changed the landscape of academia completely in several ways:

    (1) There is an excess of information available. Instructors are no longer required to be a source of information. Rather, they curate existing information.

    (2) Students today want practical skills that they can use to get a job, and not necessarily a degree. Even if they don't use the same words, students are looking for outcomes (e.g. actionable skills, and not just knowledge about a subject).

    (3) Infrastructure, at least in the West, has improved to the extent that anyone with a video camera and basic tools can design, deliver, and take payment for courses.

    (4) Students are no longer just your typical 18-22 year olds. They can be a mom who wants to get a certification, a soldier in Afghanistan, or an office worker in Hanoi. Educators need to be flexible about the place, time, format and frequency of courses.

    (5) Technology has eliminated much of the manual work that teachers (grading) and administrators (registration) used to do.

    (6) Students want short courses that utilize all the technology available (multimedia, social media, games).

    This has created a situation where technology is freely available and lets anyone teach or learn: students who want flexibility, teachers who can now become one-person universities. Yet many schools are still stuck in the past.

    Radical changes in educational content and delivery mechanisms will lead to an unbundling of the university as we know it. The natural question then becomes, What will it look like? We cannot say for sure, but we can at least outline some possibilities.

    We think it is possible that education will go the "Amazon route" or the "eBay route." Under the Amazon model of education, the focus will be on service delivery. One or two large providers will emerge from the rubble and provide courses much as Amazon does. Courses will be in the millions, with different providers, some celebrities (the Stephen King of lecturers) and some not. Pricing will thus be equally complex. Professors and courses will be rated, and you will be able to see the top 100 courses that help you learn to program, for instance. Of course, you will have a "wizard" that will help you figure out exactly what you need. Or we could go the eBay route, where courses will be auctioned off. In this model, too, you will have ratings of courses and providers. Delivery will take place once you have won the auction.

    In both models, anyone can be a seller or buyer of education. However, it will only be through the ratings that you can establish a history of being a good student or teacher.

    Which model wins? We have no idea. However, the first platform that allows a student to pick any course from a huge variety of courses within his price range or a quoted price, and customize it, will have an advantage.

    How will credentialing take place? We think credentialing will go away, as the rating system will determine quality.

    How will grades be determined? This one is more difficult to answer, as an outcome-based education may not necessarily rely on grades, and outcomes are difficult to measure in an absolute way.

    What are traditional colleges to do? They cannot go completely online tomorrow. Christensen said that one innovation traditional colleges could make is to offer a few "gateway" majors, and then use technology to personalize and individualize teaching on specific subjects. Christensen notes that Western Governors University only offers four degrees and hasn't raised tuition in five years. We think another possibility is that you can have a group of universities collaborating ("the Ivies" or the "East Coast") and allowing students to pick professors at any university in the circle and dictate a combination of in-person/online courses.

    In any case, we think that the Arab Spring of higher education is already starting to take place, and will change the face of higher education in fundamental ways. Whether it is an Amazon model, or an eBay model, or some combination of both, we'll have to wait and see.

    By Daniel de Vise | 01:26 PM ET, 02/03/2012
Bonnie Sutton

Broadband, Social Networks, and Mobility Have Spawned a New Kind of Learner - 0 views

ctia. broadband mobility social networks new learner smart phones
started by Bonnie Sutton on 02 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    12/13/11
    Students are different today because of technology. Every educator knows this, of course, but this change is about much more than agile thumbs, shriveling attention spans, and OMG'd vocabularies. According to the Pew Research Center, the combination of widespread access to broadband Internet connectivity, the popularity of social networking, and the near ubiquity of mobile computing is producing a fundamentally new kind of learner, one that is self-directed, better equipped to capture information, more reliant on feedback from peers, more inclined to collaborate, and more oriented toward being their own "nodes of production."

    "These three elements together have changed the context of learning," says Lee Rainie, director of the Pew Research Center's Internet and American Life Project. "Today, knowledge is literally at your fingertips."

    Rainie spoke to attendees at the 2011 State Educational Technology Directors Association (SETDA) Leadership Summit in Washington, DC. The Pew Center's Internet and American Life Project is a non-profit, non-partisan "fact tank" that studies the social impact of the Internet. Rainie is a co-author of Up for Grabs; Hopes and Fears; Ubiquity, Mobility, Security; and Challenges and Opportunities--books focused on the future of the Internet. He's also co-authoring a book, expected to debut in early 2012 from MIT Press, on the social impact of technology.

    "I don't have to have an opinion," Rainie joked during his keynote. "I just have to find out what's true."

    The Pew Center conducted its first survey of Internet behavior in 1999 and watched what Rainie called "the broadband revolution" unfold.

    "We watched as the world moved from a dial-up world to a broadband world," he said." The spread of broadband made it possible for students to become content creators. We know that three-quarters of Internet-connected teenagers now create content and share it online. It's not necessarily profound stuff--it's not War and Peace. They're sharing status updates; they're telling stories about their lives; they're reacting to things; and they're rating and ranking things. But that's the way people are using these new tools to tell stories about themselves."

    According to the Pew Center, 95 percent of teenagers now use the Internet, while 78 percent of adults use it. And 82 percent of teenagers (ages 12-17) have broadband at home, as opposed to 62 percent of adults.

    Broadband also inherently facilitates new forms of information dissemination, Rainie said.

    "Links have now become a central aspect of text," he said. "There are clear ways now that students assess and use information in the context of the links embedded in text. They will check primary sources, go back to original documents, and do a little bit more work to get there. Links have changed the way knowledge is presented; it's no longer linear. It's sometimes disrupted, scattered, and related to multimedia. There are ways now in the linked environment to pack more information into textbooks and other learning vehicles. You can do story telling in ways now that you never could."

    The proliferation of broadband helped to facilitate the rise of the social network, Rainie said. Students now turn to their social networks to help them in three ways:

    To act as sentries or early warning signals about what's going on in the world and what's the news in their social environment;
    To act as evaluators of information (Is it true? What does it mean? How much weight should I give it?); and
    To act as an audience.
    "We've all got audiences now on Twitter and Facebook," Rainie said. "Everybody can be a publisher and broadcaster; students in particular are taking advantage of that. Predictably, young people have an acute sense that they're sort of performing for the people in their social network, particularly the people who don't know them very well. They want to increase their reputation, increase their status, and build communities. Social networks are now primary places where people can kind of show off and strut their stuff."

    According to the Pew Center, 80 percent of teenagers who are online (76 percent of all teenagers) now participate in social networking sites like Facebook, though just 16 percent participate in Twitter, which is actually a microblogging service. Meanwhile, 65 percent of adult Internet users (50 percent of all adults) are now using social networking sites; 33 percent of those who are 65 and older and use the Internet now participate in social networking sites.

    The advent of broadband combined with the popularity of social networks has also given students the means to publish pictures and video, which is changing students in another important way, Rainie said.

    "When people start sharing pictures online, they become radically different social beings," he said. "They're thinking chronologically about their lives and sharing specific moments with their friends. This kind of information sharing has become a deeply rooted expectation. The visual element in social networking--posting pictures after a party or concert--is now almost as important as the texted information."

    Pulling these two "revolutions" together is the spread of mobile computing. Rainie cited some statistics from CTIA, the wireless trade lobby, which reported last year that there were 303 million total wireless subscriber lines in the United States. Just this quarter CTIA reported that that number had grown to 322.9 million.

    "There are more cell phones in America now than there are people," Rainie said. "And we're not the first to this party. Hong Kong, Singapore, Korea, and a couple of Scandinavian countries have more than 100 percent penetration of cell phones. And the numbers are only going to get bigger."

    Currently, about 30 percent of teenagers have smart phones--iPhone, Androids, etc.--that provide Internet connectivity; most teens have so-called feature phones, which allow them to send text messages, but not surf the Web.

    What's important about mobility in the context of education, Rainie said, is that it changes the way people think about the availability of information, knowledge, and learning.

    "It alters the places where learning takes place and expectations about where learning can take place," Rainie said. "When something is perceived to be available all the time, anywhere, on any device, it changes the way that anybody, but particularly students, thinks about how they can access the information and media they want on the schedule they want."

    Among the consequences of this confluence of trends is a massive inflow of data, which the enterprise world refers to as "Big Data": datasets that have grown so large and unwieldy that traditional database management tools can no longer handle them.

    "Because the broadband environment has so increased the volume and velocity and variety of information in people's lives, analytics has become much more important," Rainie said. "We are entering the age of Big Data, and it's one of the trends we are going to be tracking at Pew. The big question is, how are we going to figure out what all these data are telling us about our students.

    About the Author

    John K. Waters is a freelance journalist and author based in Palo Alto, CA.
Bonnie Sutton

Digital Learning Day - 0 views

Ferdi Serim Digital Learning Day
started by Bonnie Sutton on 01 Feb 12 no follow-up yet
  • Bonnie Sutton
     
    Digital Learning Day - A Movement's Defining Moment by Ferdi Serim
    Posted on February 1st, 2012
    Digital Learning Day - A Movement's Defining Moment

    By Ferdi Serim

    With over four times the participants of Woodstock, Digital Learning Day may serve as the downbeat for a movement that cares as much about learning as "the Woodstock generation" cared about music over four decades ago. For those of us living then, Woodstock was about much more than music: it symbolized the cultural power inherent in the free expression of ideas, whether supported or not by the dominant culture. For Digital Learners, the genie is out of the bottle. No longer is learning limited by locality. No longer must instruction suffer from isolation. No longer can anyone limit what can be learned, by whom, from whom and when.

    This moment has been decades in coming. Personally, I've been involved as "midwife" for the birth of digital learning for over twenty years. In 1992, my sixth grade students participated via email and listserv with NASA physicists in a simulation of solar sailers (spacecraft powered by solar sails) on a mission to Mars, in commemoration of the 500th anniversary of Columbus' voyage. The Internet didn't have pictures yet back then. Universities had Internet access, while some K12 schools and most homes were limited to dial-up modems. What a difference we see today!

    Ubuntu is a word from the Nguni Bantu languages of southern Africa that means "I am who I am because of who we are together," and it captures the ethos of the open education resource movement. Digital Learning Day marks a moment when we have the opportunity to look up, look around and celebrate what millions of us are doing to move education forward.

    Surveying the dazzling array of activities taking place in 38 states, one sees the cloud of despair that shrouds most education-focused discussions today dispelled by the passion, innovation and commitment of people who've personally experienced the power of Digital Learning. These initiatives can't be confined to a single day, but the genius of connecting all these innovators (who include students, parents and community members in addition to the expected policymakers, providers and school leaders) is what adds the quality of a defining moment to the enterprise.

    And yet, the challenge is huge. For too long, the goal of connecting classroom learning with real-life relevance has seemed beyond our grasp, with the vast potentials of digital age learning taking root in only a relative handful of pioneering educators' learning environments.

    Digital Learning Day challenges us to consider and explore innovations that require us, and everyone we work with (students, parents, peers and school leaders), to examine everything we do through a very rigorous lens: if what we do is helping kids prepare for a future filled with expanded opportunity, we will continue to do it; if not, we won't, and we will remove any obstacle that stands in the way.

    This commitment unavoidably puts us in the path of conflict and controversy, as old and new worlds collide. Until recently, the path of least resistance was "business as usual," but it is becoming clearer every day that we can no longer afford (on any level) to sustain practices that don't work. I acknowledge and salute your efforts to build the schools we need, starting now!

    What can you do, right now, to take practical steps that prepare yourself, your classroom and students for the power of Digital Learning? You can learn about strategies to incorporate the power of a blended model into the work you ask students to do.

    As fate would have it, the day shares the name of my new book: Digital Learning. This could not have been predicted eight years ago, when I began work on the book. The process by which this book came to exist is unusual, and is itself a journey into project-based learning. In 2003, I realized that progress in extending the benefits of digital learning as a right for all students would be blocked unless and until we had ways of assessing what was then the relatively new idea called "21st century skills." The measurement mania was already creating a hyperfocus on what is tested, marginalizing development of key capabilities whose growth could not be strengthened until they could be observed. This became my project.

    I wanted to find a way to help teachers simultaneously meet core content standards while developing what became known as 21st Century Skills. The vehicle, creating engaging projects that would be completed online by teams of students, grew into the strategies that I share in the book.

    In the ensuing years, we've seen the emergence of the Common Core State Standards and the requirement that schools "upgrade their game" to address College and Career Readiness. Digital Learning projects cause students to act and think in precisely the 21st century ways that are necessary for success in both the new standards and new levels of performance expected by employers and higher education. The practical processes explained in the book are designed to provide a bridge for school leaders, teachers and students that allow them to connect their prior knowledge and experiences in ways that lead to success in new, blended learning environments.

    Here are links to further information and resources:

    Digital Learning Day Website: http://www.digitallearningday.org/

    Digital Learning Day State events: http://www.digitallearningday.org/events/state-events

    Digital Learning Process Website: http://digitallearningprocess.net

    Digital Learning Process Resource Exchange: http://digitallearningprocess.schooltown.net

    Digital Learning Online Course: http://www.kdsi.org/CL-Digital-Learning.aspx

    Ferdi Serim helps people become more effective in "real life" by incorporating the power of digital learning communities focused on talent development. He has worked in many venues: Board Member of the International Society for Technology in Education (ISTE); the New Mexico Public Education Department's EdTech Director, Reading First Director, Program Manager for Literacy, Technology & Standards; Board Member of the Consortium for School Networking (CoSN), Innovate+Educate, and Education360; director of the Online Internet Institute (OII); Associate of the David Thornburg Center for Professional Development (and jazz musician). Most recently, he has launched CLARO Consulting: Community Learning and Resource Optimization, focused on talent development through knowledge capture and transfer.
Bonnie Sutton

Michelle Rhee's empty claims about her D.C. schools record - 1 views

Rhee false test data reporting Shanker Institute
started by Bonnie Sutton on 31 Jan 12 no follow-up yet
  • Bonnie Sutton
     
    Michelle Rhee's empty claims about her D.C. schools record
    http://www.washingtonpost.com/blogs/answer-sheet/post/michelle-rhees-empty-claims-about-her-dc-schools-record/2012/01/30/gIQAATFjdQ_blog.html

    By Valerie Strauss
    This was written by Matthew Di Carlo, senior fellow at the non-profit Albert Shanker Institute, located in Washington, D.C. This post originally appeared on the institute's blog.



    By Matthew Di Carlo

    Michelle Rhee, the controversial former chancellor of D.C. public schools, is a lightning rod. Her confrontational style has made her many friends as well as enemies. As is usually the case, people's reaction to her approach in no small part depends on whether or not they support her policy positions.

    I try to be open-minded toward people with whom I don't often agree, and I can certainly accept that people operate in different ways. I have no doubt as to Ms. Rhee's sincere belief in what she's doing; and, even if I think she could go about it differently, I respect her willingness to absorb so much negative reaction in order to try to get it done.

    What I find disturbing is how she continues to try to build her reputation and advance her goals based on interpretations of testing results that are insulting to the public's intelligence.

    In a recent New York Daily News op-ed, Ms. Rhee once again offered up testing results during her time at the helm of D.C. schools as evidence that her policy preferences work. In supporting New York City Mayor Michael Bloomberg's newly announced plan to give bonuses to the city's teachers, she pointed to her own plan in D.C., as a result of which, she said, "teachers are being rewarded for great work."

    In the very next sentence, she goes on to say:



    That's not all. D.C. students have made strong academic gains. This progress by the city's children, who were lagging so far behind their peers around the country, has been demonstrated on both city-administered tests and the federal test known as the National Assessment of Educational Progress. High school graduation rates also rose, and the city experienced public school enrollment increases for the first time in four decades. Sure, merit pay alone did not produce all of these successes. There is unfortunately no single solution that by itself can solve the many problems facing our schools.



    In other words, she's saying that the performance bonus program produced some - but not all - of these "successes."

    In order to assess the utter emptiness of this claim, we need to do a little housekeeping of her premises. Let's put aside the fact that the test results to which she refers are not "gains" but cohort changes (comparisons between two different groups of students) and that these results were almost certainly influenced by changes in demographics among DCPS's rapidly-changing student body. Let's also ignore the fact that, in the case of D.C.'s state tests (the DC-CAS), the district only releases proficiency rates and not test scores, which means that we really don't know much about the actual performance of the "typical student."

    Finally, and most importantly, let's dismiss all the tried and true principles of policy analysis and assume that we can actually judge the effectiveness of a particular policy intervention by looking at changes in raw test scores immediately after it is implemented. In other words, we'll assume that merit pay itself is at least partially responsible for any changes in scores that coincide with its implementation, rather than all the other policies and factors, school- and non-school, that influence results.

    Even if we ignore the fact that all these premises are, at best, highly problematic - and give her the benefit of the doubt - the evidence she is using supports a conclusion that is opposite from the one she reaches.

    DCPS's NAEP scores and state test proficiency rates increased quite a bit between 2007 and 2009. Michelle Rhee's performance pay plan awarded its first bonuses based on teacher evaluation results for the 2009-10 school year (the bonus amounts also depended on other factors, such as the poverty level of the school in which teachers work).

    Since that time, DCPS performance on both tests has been largely flat.

    DC-CAS proficiency rates for elementary school students are actually a few percentage points lower than they were in 2009, while the rates among secondary students are a few points higher. These are both rather modest (and perhaps not statistically significant) two-year changes. Although the timing of the NAEP TUDA test does not coincide perfectly with the start of the DC bonus program, scores were also statistically unchanged between 2009 and 2011 in three out of the four NAEP tests (fourth grade math, and fourth and eighth grade reading), while there was a moderate discernible increase in the average eighth grade math score.

    For the most part, then, there was little meaningful change in DCPS testing performance over the past two full school years.

    Moreover, the graduation rate was also essentially unchanged - moving from 72 percent in 2009 to 73 percent in 2010 (2011 rates will be released later this year).

    So, according to the standards by which Ms. Rhee judges policy effects - and advocates for their expansion - her performance bonus program has not worked. And, for the record, the same thing goes for her other signature policies, including D.C.'s new evaluation system and the annual dismissals based on the results of that system.

    Yet, on the pages of a major newspaper, she's using the same evidence to say they've succeeded. It's not only unsupportable - it's downright nonsensical.

    Luckily for Ms. Rhee, the standard by which she would have her policies be judged is completely inappropriate. The relative success or failure of her bonus program, new teacher evaluation system, and all her other policies is an open question - one that must be addressed by high-quality program evaluations that are specifically designed to isolate these policies' effects from all the other factors that affect performance. She is also fortunate that, insofar as the purpose of merit pay plans, according to proponents, is to attract "better candidates" to the district and keep them around, the outcomes of this program - if there are any - would take several years to even begin to show up. We therefore should not judge this policy based on short-term testing (or non-testing) outcomes.

    Put simply, Ms. Rhee's own evidentiary standards are so flawed that we must ignore the fact that, in this case, if anything, they actually work against her.

    Michelle Rhee, while hardly alone in misinterpreting data to support policy beliefs, is a national figure. In many respects, she is the most prominent voice among the market-based reform crowd. Some of her ideas may even have some merit, and she is trying to make her case, but she does herself, her supporters, and the public a disservice by continuing to abuse evidence in an attempt to make it. This cheapens the debate and perpetuates a flawed understanding of the uses of data and research to inform policy. That benefits nobody, especially students.

    The views expressed in this post do not necessarily reflect the views of the Albert Shanker Institute, its officers, board members, or any related entity or organization.



Bonnie Sutton

The True Cost of High School Dropouts - 3 views

started by Bonnie Sutton on 31 Jan 12 no follow-up yet
  • Bonnie Sutton
     
    The True Cost of High School Dropouts
    By HENRY M. LEVIN and CECILIA E. ROUSE
    Published: January 25, 2012

    http://www.nytimes.com/2012/01/26/opinion/the-true-cost-of-high-school-dropouts.html

    ONLY 21 states require students to attend high school until they graduate or turn 18. The proposal President Obama announced on Tuesday night in his State of the Union address - to make such attendance compulsory in every state - is a step in the right direction, but it would not go far enough to reduce a dropout rate that imposes a heavy cost on the entire economy, not just on those who fail to obtain a diploma.
    (Illustration: Oliver Munday and Ryan LeCluyse)
    In 1970, the United States had the world's highest rate of high school and college graduation. Today, according to the Organization for Economic Cooperation and Development, we've slipped to No. 21 in high school completion and No. 15 in college completion, as other countries surpassed us in the quality of their primary and secondary education.

    Only 7 of 10 ninth graders today will get high school diplomas. A decade after the No Child Left Behind law mandated efforts to reduce the racial gap, about 80 percent of white and Asian students graduate from high school, compared with only 55 percent of blacks and Hispanics.

    Like President Obama, many reformers focus their dropout prevention efforts on high schoolers; replacing large high schools with smaller learning communities where poor students can get individualized instruction from dedicated teachers has been shown to be effective. Rigorous evidence gathered over decades suggests that some of the most promising approaches need to start even earlier: preschool for 3- and 4-year-olds, who are fed and taught in small groups, followed up with home visits by teachers and with group meetings of parents; reducing class size in the early grades; and increasing teacher salaries from kindergarten through 12th grade.

    These programs sound expensive - some Americans probably think that preventing 1.3 million students from dropping out of high school each year can't be done - but in fact the costs of inaction are far greater.

    High school completion is, of course, the most significant requirement for entering college. While our economic competitors are rapidly increasing graduation rates at both levels, we continue to fall behind. Educated workers are the basis of economic growth - they are especially critical as sources of innovation and productivity given the pace and nature of technological progress.

    If we could reduce the current number of dropouts by just half, we would gain almost 700,000 new graduates a year, and the investment would more than pay for itself. Studies show that the typical high school graduate will obtain higher employment and earnings - an astonishing 50 percent to 100 percent increase in lifetime income - and will be less likely to draw on public money for health care and welfare and less likely to be involved in the criminal justice system. Further, because of the increased income, the typical graduate will contribute more in tax revenues over his lifetime than if he'd dropped out.

    When the costs of investment to produce a new graduate are taken into account, there is a return of $1.45 to $3.55 for every dollar of investment, depending upon the educational intervention strategy. Under this estimate, each new graduate confers a net benefit to taxpayers of about $127,000 over the graduate's lifetime. This is a benefit to the public of nearly $90 billion for each year of success in reducing the number of high school dropouts by 700,000 - or something close to $1 trillion after 11 years. That's real money - and a reason both liberals and conservatives should rally behind dropout prevention as an element of economic recovery, leaving aside the ethical dimensions of educating our young people.
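
    For readers who want to see how those headline figures fit together, the following is a minimal back-of-the-envelope sketch in Python using only the numbers quoted above. The variable names and the simple multiply-and-accumulate model are illustrative assumptions, not the authors' actual cost-benefit methodology (which would, among other things, discount future benefits).

        # Back-of-the-envelope check (illustrative only; inputs taken from the op-ed above).
        new_graduates_per_year = 700_000      # roughly half of the ~1.3 million annual dropouts
        net_benefit_per_graduate = 127_000    # estimated lifetime net benefit to taxpayers, in dollars

        annual_public_benefit = new_graduates_per_year * net_benefit_per_graduate
        print(f"One year's cohort: ${annual_public_benefit / 1e9:.1f} billion")
        # -> about $88.9 billion, i.e. "nearly $90 billion"

        years = 11
        print(f"After {years} years: ${years * annual_public_benefit / 1e12:.2f} trillion")
        # -> about $0.98 trillion, i.e. "close to $1 trillion"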

    Some might argue that these estimates are too large, that the relationships among the time-tested interventions, high school graduation rates and adult outcomes have not been proved yet on a large scale. Those are important considerations, but the evidence cannot be denied: increased education does, indeed, improve skill levels and help individuals to lead healthier and more productive lives. And despite the high unemployment rate today, we have every reason to believe that many of these new graduates would find work - our history is filled with sustained periods of economic growth when increasing numbers of young people obtained more schooling and received large economic benefits as a result.

    Of course, there are other strategies for improving educational attainment - researchers learn more every day about which are effective and which are not. But even with what we know, a failure to substantially reduce the numbers of high school dropouts is demonstrably penny-wise and pound-foolish.

    Proven educational strategies to increase high school completion, like high-quality preschool, provide returns to the taxpayer that are as much as three and a half times their cost. Investing our public dollars wisely to reduce the number of high school dropouts must be a central part of any strategy to raise long-run economic growth, reduce inequality and return fiscal health to our federal, state and local governments.

    Henry M. Levin is a professor of economics and education at Teachers College, Columbia University. Cecilia E. Rouse, a professor of economics and public affairs at Princeton University, was a member of President Obama's Council of Economic Advisers from 2009 to 2011.