
Home/ CTLT and Friends/ Group items matching "informal" in title, tags, annotations or url

Sarah Usher

Police Jobs Through Police-Recruitment UK - 1 views

I was searching for police force jobs that will suit the qualifications that I have. I searched in offices and online until I came across Police-Recruitment UK. I was able to set my sights on a sp...

police force jobs

started by Sarah Usher on 06 Sep 11 no follow-up yet
Sarah Usher

Pass the Police Recruitment Process in One Attempt - 1 views

I was so happy PoliceRecruitmentUK provided me a lot of information about the police recruitment process! They showed me tips and information on what to expect during the selection process. That ...

police recruitment

started by Sarah Usher on 13 Jul 11 no follow-up yet
Sarah Usher

I am Now a Police Officer in Kent - 2 views

PoliceRecruitmentUK really helped me a lot in the police recruitment process. They gave me all the necessary information on how to pass the process and become a police officer. I never expected I ...

police jobs

started by Sarah Usher on 03 Jun 11 no follow-up yet
Sarah Usher

One Step Closer to My Dream - 3 views

My father was a police officer and he died protecting people and making this world a better place. All my life, I always wanted to follow in my father's footsteps and follow a path with police care...

police careers assessment education accountability higher_education

started by Sarah Usher on 01 Jun 11 no follow-up yet
Joshua Yeidel

Evaluating Teachers: The Important Role of Value-Added [pdf] - 1 views

  •  
    "We conclude that value-added data has an important role to play in teacher evaluation systems, but that there is much to be learned about how best to use value-added information in human resource decisions." No mention of the role of assessment in improvement.
Nils Peterson

U. of Phoenix Reports on Students' Academic Progress - Measuring Stick - The Chronicle of Higher Education - 0 views

  • In comparisons of seniors versus freshmen within the university, the 2,428 seniors slightly outperformed 4,003 freshmen in all categories except natural sciences, in which they were equivalent.
    • Nils Peterson
       
      This is the value added measure.
  • The University of Phoenix has released its third “Academic Annual Report,” a document that continues to be notable not so much for the depth of information it provides on its students’ academic progress but for its existence at all.
    • Nils Peterson
       
      Provides a range of measures, including demographics, satisfaction, indirect measures of perceived utility, and direct measures using national tests.
  • The Phoenix academic report also includes findings on students’ performance relative to hundreds of thousands of students at nearly 400 peer institutions on two standardized tests
  • ...1 more annotation...
  • University of Phoenix seniors slightly underperformed a comparison group of 42,649 seniors at peer institutions in critical thinking, humanities, social sciences, and natural sciences, and moderately underperformed the peer group in reading, writing, and mathematics.
Gary Brown

Sincerity in evaluation - highlights and lowlights « Genuine Evaluation - 3 views

  • Principles of Genuine Evaluation When we set out to explore the notion of ‘Genuine Evaluation’, we identified 5 important aspects of it: VALUE-BASED -transparent and defensible values (criteria of merit and worth and standards of performance) EMPIRICAL – credible evidence about what has happened and what has caused this, USABLE – reported in such a way that it can be understood and used by those who can and should use it (which doesn’t necessarily mean it’s used or used well, of course) SINCERE – a commitment by those commissioning evaluation to respond to information about both success and failure (those doing evaluation can influence this but not control it) HUMBLE – acknowledges its limitations From now until the end of the year, we’re looking at each of these principles and collecting some of the highlights and lowlights  from 2010 (and previously).
  • Sincerity of evaluation is something that is often not talked about in evaluation reports, scholarly papers, or formal presentations, only discussed in the corridors and bars afterwards.  And yet it poses perhaps the greatest threat to the success of individual evaluations and to the whole enterprise of evaluation.
Nils Peterson

There is No College Cost Crisis - NYTimes.com - 3 views

  • “[A] modern university must provide students with an up-to-date education that familiarizes students with the techniques and associated machinery that are used in the workplace the students must enter.”
    • Nils Peterson
       
      the author means information technologies, but one might also talk about certain habits of mind which are associated with trends in the workplace the students must enter.
  • The causes of the increase in college costs (an increase that has not, they contend, put college “out of reach”) are external; colleges are responding, as they must, to changes they cannot ignore and still provide a quality product.
    • Nils Peterson
       
      Makes me want to revisit Christensen's The Innovator's Dilemma, which describes a business focusing on serving its best customers and losing focus on emerging products and markets that operate at lower price points.
  •  
    Stanley Fish reviews "Why Does College Cost So Much?" by Archibald and Feldman, two economists who conclude "there is no college cost crisis."
Nils Peterson

Nonacademic Members Push Changes in Anthropology Group - Faculty - The Chronicle of Higher Education - 1 views

  • Cathleen Crain, an anthropologist who runs a consulting firm near Washington: "There is a growing vision of a unified anthropology, where academics informs practice and practice informs academics."
    • Nils Peterson
       
      Anthropology is having a conversation about stakeholders, and this is impacting the national anthro organization. I wonder if it's producing metrics that might inform student learning outcomes work.
Gary Brown

Researchers Criticize Reliability of National Survey of Student Engagement - Students - The Chronicle of Higher Education - 3 views

  • "If each of the five benchmarks does not measure a distinct dimension of engagement and includes substantial error among its items, it is difficult to inform intervention strategies to improve undergraduates' educational experiences,"
  • Only one benchmark, enriching educational experiences, had a significant effect on the seniors' cumulative GPA.
  • Other critics have asserted that the survey's mountains of data remain largely ignored.
  •  
    If the results are largely ignored, the psychometric integrity matters little. There is no indication that they are ignored because the survey lacks psychometric integrity.
Nils Peterson

Community Colleges Must Focus on Quality of Learning, Report Says - Students - The Chronicle of Higher Education - 0 views

  • Over all, 67 percent of community-college students said their coursework often involved analyzing the basic elements of an idea, experience, or theory; 59 percent said they frequently synthesized ideas, information, and experiences in new ways. Other averages were lower: 56 percent of students, for example, reported being regularly asked to examine the strengths or weaknesses of their own views on a topic. And just 52 percent of students said they often had to make judgments about the value or soundness of information as part of their academic work.
    • Nils Peterson
       
      One wonders who the stakeholder is that commissioned this assessment and thinks these measures are important -- it looks like the CITR might be underlying their thinking.
Gary Brown

Community Colleges Must Focus on Quality of Learning, Report Says - Students - The Chronicle of Higher Education - 0 views

  • Increasing college completion is meaningless unless certificates and degrees represent real learning, which community colleges must work harder to ensure, says a report released on Thursday by the Center for Community College Student Engagement.
  • This year's report centers on "deep learning," or "broadly applicable thinking, reasoning, and judgment skills—abilities that allow individuals to apply information, develop a coherent world view, and interact in more meaningful ways."
  • 67 percent of community-college students said their coursework often involved analyzing the basic elements of an idea, experience, or theory; 59 percent said they frequently synthesized ideas, information, and experiences in new ways. Other averages were lower: 56 percent of students, for example, reported being regularly asked to examine the strengths or weaknesses of their own views on a topic. And just 52 percent of students said they often had to make judgments about the value or soundness of information as part of their academic work.
  • ...5 more annotations...
  • One problem may be low expectations,
  • 37 percent of full-time community-college students spent five or fewer hours a week preparing for class. Nineteen percent of students had never done two or more drafts of an assignment, and 69 percent had come to class unprepared at least once.
  • Nearly nine in 10 entering students said they knew how to get in touch with their instructors outside of class, and the same proportion reported that at least one instructor had learned their names. But more than two-thirds of entering students and almost half of more-seasoned students said they had never discussed ideas from their coursework with instructors outside of class.
  • This year's report also strongly recommends that colleges invest more in professional development, for part-time as well as full-time faculty. "The calls for increased college completion come at a time of increasing student enroll­ments and draconian budget cuts; and too often in those circumstances, efforts to develop faculty and staff take low priority,"
  • Lone Star College's Classroom Research Initiative, a form of professional development based on inquiry. Since last year, about 30 faculty members from the community college's five campuses have collaborated to examine assessment data from the report's surveys and other sources and to propose new ways to try to improve learning.
Gary Brown

71 Presidents Pledge to Improve Their Colleges' Teaching and Learning - Faculty - The Chronicle of Higher Education - 0 views

  • In a venture known as the Presidents' Alliance for Excellence in Student Learning and Accountability, they have promised to take specific steps to gather more evidence about student learning, to use that evidence to improve instruction, and to give the public more information about the quality of learning on their campuses.
  • The 71 pledges, officially announced on Friday, are essentially a dare to accreditors, parents, and the news media: Come visit in two years, and if we haven't done these things, you can zing us.
  • deepen an ethic of professional stewardship and self-regulation among college leaders
  • ...4 more annotations...
  • Beginning in 2011, all first-year students at Westminster will be required to create electronic portfolios that reflect their progress in terms of five campuswide learning goals. And the college will expand the number of seniors who take the Collegiate Learning Assessment, so that the test can be used to help measure the strength of each academic major.
  • "The crucial thing is that all of our learning assessments have been designed and driven by the faculty," says Pamela G. Menke, Miami Dade's associate provost for academic affairs. "The way transformation of learning truly occurs is when faculty members ask the questions, and when they're willing to use what they've found out to make change.
  • Other assessment models might point some things out, but they won't be useful if faculty members don't believe in them."
  • "In the long term, as more people join, I hope that the Web site will provide a resource for the kinds of innovations that seem to be successful," he says. "That process might be difficult. Teaching is an art, not a science. But there is still probably a lot that we can learn from each other."
Nils Peterson

Jeff Sheldon on the Readiness for Organizational Learning and Evaluation instrument | AEA365 - 4 views

shared by Nils Peterson on 01 Nov 10
  • The ROLE consists of 78 items grouped into six major constructs: 1) Culture, 2) Leadership, 3) Systems and Structures, 4) Communication, 5) Teams, and 6) Evaluation.
    • Nils Peterson
       
      You can look up the book on Amazon, use "view inside," and search for Appendix A to read the items in the survey: http://www.amazon.com/Evaluation-Organizations-Systematic-Enhancing-Performance/dp/0738202681#reader_0738202681 This might be useful to OAI in assessing readiness (or understanding what in the university culture challenges readiness), or it might inform our revision (or justify staying out) of our rubric. An initial glance indicates that there are some cultural constructs in the university that are counter-indicated by the analysis of the ROLE instrument.
  •  
    " Readiness for Organizational Learning and Evaluation (ROLE). The ROLE (Preskill & Torres, 2000) was designed to help us determine the level of readiness for implementing organizational learning, evaluation practices, and supporting processes"
  •  
    An interesting possibility for a Skylight survey (but more reading needed)
Gary Brown

Does testing for statistical significance encourage or discourage thoughtful data analysis? « Genuine Evaluation - 1 views

  • Does testing for statistical significance encourage or discourage thoughtful data analysis? Posted by Patricia Rogers on October 20th, 2010
  • Epidemiology, 9(3):333–337), which argues not only for thoughtful interpretation of findings, but for not reporting statistical significance at all.
  • We also would like to see the interpretation of a study based not on statistical significance, or lack of it, for one or more study variables, but rather on careful quantitative consideration of the data in light of competing explanations for the findings.
  • ...6 more annotations...
  • we prefer a researcher to consider whether the magnitude of an estimated effect could be readily explained by uncontrolled confounding or selection biases, rather than simply to offer the uninspired interpretation that the estimated effect is significant, as if neither chance nor bias could then account for the findings.
  • Many data analysts appear to remain oblivious to the qualitative nature of significance testing.
  • statistical significance is itself only a dichotomous indicator.
  • it cannot convey much useful information
  • Even worse, those two values often signal just the wrong interpretation. These misleading signals occur when a trivial effect is found to be ’significant’, as often happens in large studies, or when a strong relation is found ’nonsignificant’, as often happens in small studies.
  • Another useful paper on this issue is Kristin Sainani, (2010) “Misleading Comparisons: The Fallacy of Comparing Statistical Significance”Physical Medicine and Rehabilitation, Vol. 2 (June), 559-562 which discusses the need to look carefully at within-group differences as well as between-group differences, and at sub-group significance compared to interaction. She concludes: ‘Readers should have a particularly high index of suspicion for controlled studies that fail to report between-group comparisons, because these likely represent attempts to “spin” null results.”
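The pattern the quoted authors warn about can be shown with a few lines of arithmetic. This sketch is not from the post; the sample sizes, effect sizes, and the `two_sample_z` helper are my own illustration, using a simple two-sample z-test with known, equal standard deviations.

```python
import math

def two_sample_z(mean_a, mean_b, sd, n):
    """Two-sided p-value for a two-sample z-test, equal sd and group size n."""
    se = sd * math.sqrt(2.0 / n)          # standard error of the difference
    z = (mean_a - mean_b) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# A trivial effect (0.02 sd) in a huge study comes out "significant"...
z_big, p_big = two_sample_z(mean_a=100.02, mean_b=100.0, sd=1.0, n=50_000)

# ...while a strong effect (0.5 sd) in a tiny study does not.
z_small, p_small = two_sample_z(mean_a=100.5, mean_b=100.0, sd=1.0, n=10)
```

Here `p_big` falls below 0.05 despite a negligible difference, and `p_small` exceeds 0.05 despite a half-standard-deviation difference, which is exactly why the authors call the significant/nonsignificant dichotomy a misleading signal.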
Judy Rumph

Views: Why Are We Assessing? - Inside Higher Ed - 1 views

  • Amid all this progress, however, we seem to have lost our way. Too many of us have focused on the route we’re traveling: whether assessment should be value-added; the improvement versus accountability debate; entering assessment data into a database; pulling together a report for an accreditor. We’ve been so focused on the details of our route that we’ve lost sight of our destination.
  • Our destination, which is what we should be focusing on, is the purpose of assessment. Over the last decades, we've consistently talked about two purposes of assessment: improvement and accountability. The thinking has been that improvement means using assessment to identify problems — things that need improvement — while accountability means using assessment to show that we're already doing a great job and need no improvement. A great deal has been written about the need to reconcile these two seemingly disparate purposes.
  • The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education
  • ...7 more annotations...
  • Our second common purpose of assessment should be making sure not only that students learn what’s important, but that their learning is of appropriate scope, depth, and rigor.
  • Third, we need to accept how good we already are, so we can recognize success when we see it.
  • And we haven’t figured out a way to tell the story of our effectiveness in 25 words or less, which is what busy people want and need.
  • Because we're not telling the stories of our successful outcomes in simple, understandable terms, the public continues to define quality using the outdated concept of inputs like faculty credentials, student aptitude, and institutional wealth — things that by themselves don’t say a whole lot about student learning.
  • And people like to invest in success. Because the public doesn't know how good we are at helping students learn, it doesn't yet give us all the support we need in our quest to give our students the best possible education.
  • But while virtually every college and university has had to make draconian budget cuts in the last couple of years, with more to come, I wonder how many are using solid, systematic evidence — including assessment evidence — to inform those decisions.
  • Now is the time to move our focus from the road we are traveling to our destination: a point at which we all are prudent, informed stewards of our resources… a point at which we each have clear, appropriate, justifiable, and externally-informed standards for student learning. Most importantly, now is the time to move our focus from assessment to learning, and to keeping our promises. Only then can we make higher education as great as it needs to be.
  •  
    Yes, this article resonated with me too, especially its connecting of assessment to teaching and learning. The most important purpose of assessment should be not improvement or accountability but their common aim: everyone wants students to get the best possible education.... Today we seem to be devoting more time, money, thought, and effort to assessment than to helping faculty help students learn as effectively as possible. When our colleagues have disappointing assessment results and don't know what to do to improve them, I wonder how many have been made aware that, in some respects, we are living in a golden age of higher education, coming off a quarter-century of solid research on practices that promote deep, lasting learning. I wonder how many are pointed to the many excellent resources we now have on good teaching practices, including books, journals, conferences and, increasingly, teaching-learning centers right on campus. I wonder how many of the graduate programs they attended include the study and practice of contemporary research on effective higher education pedagogies. No wonder so many of us are struggling to make sense of our assessment results! Too many of us are separating work on assessment from work on improving teaching and learning, when they should be two sides of the same coin. We need to bring our work on teaching, learning, and assessment together.
Theron DesRosier

Assessing Learning Outcomes at the University of Cincinnati: Comparing Rubric Assessments to Standardized Tests - 2 views

  •  
    "When the CLA results arrived eight months later, the UC team compared the outcomes of the two assessments. "We found no statistically significant correlation between the CLA scores and the portfolio scores," Escoe says. "In some ways, it's a disappointing finding. If we'd found a correlation, we could tell faculty that the CLA, as an instrument, is measuring the same things that we value and that the CLA can be embedded in a course. But that didn't happen." There were many factors that may have contributed to the lack of correlation, she says, including the fact that the CLA is timed, while the rubric assignments are not; and that the rubric scores were diagnostic and included specific feedback, while the CLA awarded points "in a black box": if a student referred to a specific piece of evidence in a critical-thinking question, he or she simply received one point. In addition, she says, faculty members may have had exceptionally high expectations of their honors students and assessed the e-portfolios with those high expectations in mind-leading to results that would not correlate to a computer-scored test. In the end, Escoe says, the two assessments are both useful, but for different things. The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement. "
  •  
    Another institution trying to make sense of the CLA. This study compared student's CLA scores with criteria-based scores of their eportfolios. The study used a modified version of the VALUE rubrics developed by the AACU. Our own Gary Brown was on the team that developed the critical thinking rubric for the VALUE project.
  •  
    "The CLA can provide broad institutional data that satisfies VSA requirements, while rubric-based assessment provides better information to facilitate continuous program improvement. " This begs some questions: what meaning can we attach to these two non-correlated measures? What VSA requirements can rubric-based assessment NOT satisfy? Are those "requirements" really useful?
Gary Brown

Book review: Taking Stock: Research on Teaching and Learning in Higher Education « Tony Bates - 2 views

  • Christensen Hughes, J. and Mighty, J. (eds.) (2010) Taking Stock: Research on Teaching and Learning in Higher Education Montreal QC and Kingston ON: McGill-Queen’s University Press, 350 pp, C$/US$39.95
  • ‘The impetus for this event was the recognition that researchers have discovered much about teaching and learning in higher education, but that dissemination and uptake of this information have been limited. As such, the impact of educational research on faculty-teaching practice and student-learning experience has been negligible.’
  • Julia Christensen Hughes
  • ...10 more annotations...
  • Chapter 7: Faculty research and teaching approaches Michael Prosser
  • What faculty know about student learning Maryellen Weimer
  • Practices of Convenience: Teaching and Learning in Higher Education
  • Chapter 8: Student engagement and learning: Jillian Kinzie
  • (p. 4)
  • ‘much of our current approach to teaching in higher education might best be described as practices of convenience, to the extent that traditional pedagogical approaches continue to predominate. Such practices are convenient insofar as large numbers of students can be efficiently processed through the system. As far as learning effectiveness is concerned, however, such practices are decidedly inconvenient, as they fall far short of what is needed in terms of fostering self-directed learning, transformative learning, or learning that lasts.’
  • p. 10:
  • …research suggests that there is an association between how faculty teach and how students learn, and how students learn and the learning outcomes achieved. Further, research suggests that many faculty members teach in ways that are not particularly helpful to deep learning. Much of this research has been known for decades, yet we continue to teach in ways that are contrary to these findings.’
  • ‘There is increasing empirical evidence from a variety of international settings that prevailing teaching practices in higher education do not encourage the sort of learning that contemporary society demands….Teaching remains largely didactic, assessment of student work is often trivial, and curricula are more likely to emphasize content coverage than acquisition of lifelong and life-wide skills.’
  • What other profession would go about its business in such an amateurish and unprofessional way as university teaching? Despite the excellent suggestions in this book from those ‘within the tent’, I don’t see change coming from within. We have government and self-imposed industry regulation to prevent financial advisers, medical practitioners, real estate agents, engineers, construction workers and many other professions from operating without proper training. How long are we prepared to put up with this unregulated situation in university and college teaching?