
Home/ NBISE Institute/ Group items tagged research


dhtobey Tobey

Cyberlearning: Transforming Education (nsf10620)

  • NSF 10-620
  • Research supported by the Cyberlearning program will explore the opportunities for learning made possible by new technologies, how to help learners capitalize on those opportunities, new practices that are made possible by learning technologies, and ways of using technology to promote deep and lasting learning of content, practices, skills, attitudes, and/or dispositions needed for engaged and productive citizenship. Cyberlearning research will marry what is known about how people learn with advances in information and communications technologies to advance understanding of how to cultivate a citizenry that engages productively in learning both in and out of school and throughout a lifetime; and that possesses the knowledge, knowledge building, collaboration and reasoning capabilities to make informed decisions and judgments about problems ranging from their immediate lives to ethics, privacy, and security concerns to global challenges such as war and peace, economics, health and wellbeing, and the environment.
  • The goals of the Cyberlearning program are: To better understand how people learn with technology and how technology can be used productively to help people learn, through individual use and/or through collaborations mediated by technology; To better use technology for collecting, analyzing, sharing, and managing data to shed light on learning, promoting learning, and designing learning environments; and To design new technologies for these purposes, and advance understanding of how to use those technologies and integrate them into learning environments so that their potential is fulfilled.
  • Every project should therefore seek to answer questions about how to better promote learning, how to promote better learning, or how learning happens in technology-rich environments (including exploring relationships between people and technology that result in productive learning and access provided with technology to learning resources, such as data and scientific information). Each project should also focus, concurrently, on innovative technology design, ways of coherently integrating technologies for learning, and/or the integration of such technology into targeted learning environments. Especially sought are projects in which technology allows the tailoring of learning experiences to special needs and interests of groups or individuals, as well as ways in which technology allows expanding education beyond classroom settings
  • It is expected that all projects will advance understanding about how people learn with technology, how to use technology to help people learn, and/or how to use technology to enhance assessment or education practices
  • These projects may be of several different types:
  • Proposals should make clear the roles of all team members (PIs, supporting investigators, advisors, and others), why the proposed team is an appropriate one, and what expertise each team member brings. Teams should include members who have experience with the learners and environments being targeted and who are expert at relevant engagement and learning issues. Proposers should make clear the challenges associated with assessment and evaluation, robustness and broader usability that they anticipate, and the team members that will help with each of these.
  • Project proposers should also include on their teams people who can help them plan towards fulfilling the transformational potential of their work, including, as appropriate, those who can help them transition their technology to broad use and those from stakeholder groups who will need to be integrated into the project as innovations move towards scalability, broad dissemination, and continuation over time.
  • Integration and Deployment Projects (INDP Projects)
  • Since successful collaborative research depends on thoughtful coordination mechanisms, a Collaboration Plan is required for all proposals involving multiple investigators. The length of and level of detail provided in the Collaboration Plan should be commensurate with the complexity of the collaboration.
  • They may advance understanding of how to productively integrate a variety of established technologies to better promote learning or promote better learning in a target population and environment. They may provide guidelines on extending the usage of some promising technology or technologies over a larger variety of learner populations, advancing understanding of how to better address learning needs of different populations. They may provide guidelines on extending the usage of some promising technology or technologies over a larger variety of learning contexts, advancing understanding of learning processes that underlie disciplinary areas or the constraints and affordances (opportunities offered) of different environments for learning. They may combine advances in two or more of these areas.
  • It is expected that technologies will be deployed and evaluated in a large variety of learning environments, that by the end of the project, the technology will be ready for technology transfer and commercialization, and that the guidelines proposed will be broadly applicable beyond the particular technology being deployed. By later years of the project, facilitation of technology use should be done by those who would naturally be the facilitators in the chosen learning environment (e.g., teachers, scout leaders, parents, peers). Formative analyses: As for DIP projects, formative analyses should answer questions about usability, learning, effective and sustained use, as well as issues associated with scale-up, sustainability, workforce development, and long-term efficacy (as appropriate).
  • It will be appropriate for many proposals to include the development of innovative curricula or educational materials in addition to proposing technological innovations.
  • A successful research project should be potentially transformative; grounded in existing learning and education research; seek to answer questions about learning with technology; measure learning gains, take into account appropriate elements of the learning ecology in designing its innovation, evaluating its innovation, and answering research questions; include team members with all necessary expertise, including expertise for outreach and dissemination; be aware of potential scalability and sustainability issues; and use appropriate methodologies to evaluate innovations and measure learning gains. Our expectation is that many grants made by this program will seed long-term research enterprises. The transformative potential of proposed projects may be many years out, so proposers should make clear what that potential is and the predicted time horizon.
  • Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. Chapter II, Section D.4 of the Grant Proposal Guide provides additional information on collaborative proposals.
  • Proposal Titles: Proposal titles must begin with an acronym that indicates the category in which the proposal is being submitted, as follows: Exploration Projects - EXP; Design and Implementation Projects - DIP; Integration and Deployment Projects - INDP
  • The acronym should be followed with a colon then the title of the proposed project. If you submit a proposal as one in a set of collaborative proposals, the title of your proposal should begin with the acronym that indicates the project category, followed by a colon, then "Collaborative Research" followed by a colon, and then the project title. For example, if you are submitting an Exploration Project, the title of each collaborative proposal would be EXP:Collaborative Research: Project Title.   Project Summary:  The Project Summary must include an explicit description of both the Intellectual Merit and Broader Impacts of the activities proposed, preferably in separate paragraphs titled "Intellectual Merit" and "Broader Impacts".   
  • Project Description: Project Descriptions should include the following sections: Vision and Goals. Describe: The theories of learning investigators are drawing from. Learning objectives: what learners are expected to learn and how the proposed innovation or its integration into the learning environment is expected to promote that learning. The population of learners, including any needs, abilities or interests relevant to achieving the learning objectives. How the proposed innovation is matched to the needs, abilities, and interests of targeted learners. Because deep understanding and facile capabilities emerge only over long periods of time, how the proposed innovation or its integration into some learning environment is expected to sustain engagement.
  • Research Plan and Outcomes. With appropriate references to the literature, describe the research questions to be answered through your research and a comprehensive research plan to answer them. Make clear the learning domain to be explored (e.g., content, subject matter, topics, skills, practices), and make a research-based case for the promise of the particular technological innovation for promoting learning in this domain. Describe the data to be gathered and analytic approaches to be taken to analyze the data.   It is anticipated that technological innovations will be iteratively refined over the course of the project based on analysis of formative data.  Describe the formative evaluation methodology you will use, including means to assess learning and engagement.  Describe the project outcomes you expect to generate, including products. Discuss how you will collect and analyze data to supply evidence of learning outcomes. Innovation Outcomes (For DIP and INDP projects ONLY). Describe how the proposed innovations and ways of integrating them into the learning environment take into account the environmental and human factors important to learner success (e.g., the cognitive, developmental, affective, and social needs of learners, the cultural milieu in which the learning technologies will be used, and the capabilities and expectations of human agents in the environment).  All claims about the appropriateness of the proposed innovation should be supported with evidence from the literature.
  • A Collaboration Plan is required for all proposals involving multiple investigators. The length of and degree of detail provided in the Collaboration Plan should be commensurate with the complexity of the proposed project.  Collaboration Plans should be included at the end of the Project Description in a section entitled "Collaboration Plan", and up to 3 additional pages are allowed for Collaboration Plans. The Collaboration Plan should describe: the specific roles of the project participants in all organizations involved; information on how the project will be managed across all the investigators, institutions, and/or disciplines; identification of the specific coordination mechanisms that will enable cross-investigator, cross-institution, and/or cross-discipline scientific integration (e.g., yearly workshops, graduate student exchange, project meetings at conferences, use of videoconferencing resources or social media technologies, software repositories, etc.); and specific references to budget line items that support collaboration and coordination mechanisms.
  • What is the intellectual merit of the proposed activity? How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, the reviewer will comment on the quality of the prior work.) To what extent does the proposed activity suggest and explore creative, original, or potentially transformative concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources? What are the broader impacts of the proposed activity? How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?
  • Checklist of sections to be addressed in the proposal
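The title convention quoted in this solicitation can be captured in a small helper function. This is illustrative only: the category codes are the ones the solicitation defines, and the spacing mirrors its "EXP:Collaborative Research: Project Title" example.

```python
# Illustrative helper for the NSF 10-620 proposal-title convention.
# Category codes (EXP, DIP, INDP) come from the solicitation; the exact
# spacing mirrors its "EXP:Collaborative Research: Project Title" example.
def proposal_title(category, title, collaborative=False):
    if category not in ("EXP", "DIP", "INDP"):
        raise ValueError("category must be EXP, DIP, or INDP")
    prefix = category + ":"
    if collaborative:
        prefix += "Collaborative Research: "
    return prefix + title
```

For example, `proposal_title("EXP", "Project Title", collaborative=True)` yields the exact collaborative-title form shown in the solicitation.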
dhtobey Tobey

Emulab.Net - Emulab - Network Emulation Testbed Home

  • Emulab is a network testbed, giving researchers a wide range of environments in which to develop, debug, and evaluate their systems. The name Emulab refers both to a facility and to a software system. The primary Emulab installation is run by the Flux Group, part of the School of Computing at the University of Utah. There are also installations of the Emulab software at more than two dozen sites around the world, ranging from testbeds with a handful of nodes up to testbeds with hundreds of nodes.
  • Possible testbed for developing performance-based exams and simulated learning platforms. Emulab underlies DHS' DETER testbed for research and development.
dhtobey Tobey

2011 DHS S&T CSRD BAA - Federal Business Opportunities: Opportunities

  • The Department of Homeland Security (DHS) Science and Technology (S&T) Homeland Security Advanced Research Projects Agency (HSARPA) Cyber Security Division (CSD) announces a Broad Agency Announcement (BAA) for Fiscal Year 2011 to improve security in both Federal networks and the larger Internet. The BAA seeks ideas and proposals for Research and Development (R&D) in 14 Technical Topic Areas (TTAs) related to CSD.
Steve King

Services | SkillsNET

  • Here at SkillsNET, we offer a fully implemented, semantically indexed knowledge management system used to facilitate research collaboration, information access, and interoperability among workers. Utilizing our Semantic Workforce Analysis methods, organizations can realize the benefits of identifying domain ontologies and can significantly improve their knowledge management system (KMS) strategy internally and among distributed web communities. Using Latent Semantic Analysis (LSA), electronic artifacts and explicit knowledge data are analyzed, decomposed, and meta-tagged for later retrieval during a problem-solving scenario. Your workers will have the ability to access explicit knowledge sources and use tacit knowledge in collaboration with team members to identify the optimum technical solution to current problems.
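As a rough illustration of the indexing-and-retrieval step such a system performs, here is a minimal sketch in Python. It is a simplified stand-in for the LSA pipeline described above: full LSA adds a singular value decomposition over the term-document matrix to surface latent topics, while this sketch stops at TF-IDF weighting and cosine-similarity ranking. All function and variable names are illustrative, not SkillsNET's.

```python
# Minimal sketch of meta-tagging text artifacts and retrieving them by term
# similarity. A simplified stand-in for the LSA pipeline described above:
# real LSA adds an SVD over the term-document matrix, but the index/retrieve
# flow is the same. All names here are illustrative.
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build a TF-IDF vector (dict of term -> weight) for each document."""
    doc_terms = [Counter(d.lower().split()) for d in docs]
    n = len(docs)
    df = Counter()
    for terms in doc_terms:
        df.update(terms.keys())
    vectors = []
    for terms in doc_terms:
        total = sum(terms.values())
        vectors.append({t: (c / total) * math.log(n / df[t])
                        for t, c in terms.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    """Return the documents ranked by similarity to the query."""
    vecs = tf_idf_vectors(docs + [query])
    qvec = vecs[-1]
    order = sorted(range(len(docs)),
                   key=lambda i: cosine(qvec, vecs[i]), reverse=True)
    return [docs[i] for i in order]
```

A query like `retrieve("network intrusion", corpus)` ranks artifacts sharing rare query terms first; the SVD step that full LSA adds would also surface documents that share topics without sharing exact terms.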
dhtobey Tobey

AA-ISP selects BrainX On-line System for CISP® Accreditation Program | Press ...

  • “We felt a need to both identify and then test the set of competencies and skills required at the individual rep level,” stated Reeves. AA-ISP Founder and CEO Bob Perkins goes on to note, “For years corporations have made significant investments in hiring, training, and on-boarding inside sales professionals. Yet there remains a need to quantify, measure, and then test these skills to assure an individual is competent. We selected BrainX [www.BrainX.com] because of their unique on-line learning system. If a salesperson doesn’t meet the required knowledge and skill level, the BrainX on-line learning system builds a personal set of sales courses and simulations to help the salesperson master the required knowledge and skills so they can pass the accreditation requirements.”
  • Bruce Lewolt, the CEO of BrainX, says he is proud that an organization that understands the world of sales training as well as the AA-ISP would recognize the value of personalized sales training that is mastery based.
  • Lewolt agreed and added that it is only with this level of mastery that the conscious centers of the brain are freed up to really listen to what the customer is saying well enough to be able to read between the lines and figure out what is really driving the customer.
  • The BrainX On-line Learning system is an intelligent system that uses a series of knowledge, skill and belief assessments and builds a cognitive learning profile on each learner. The BrainX Digital Tutor uses the first set of assessments to figure out just what each sales person should focus on and combines this with the personal cognitive profile to mold the delivery of the lesson content and simulations in a way that helps each individual master the required set of skills and competencies in the shortest amount of time so they can obtain their CISP®.
  • About BrainX BrainX is the next generation of online learning and Talent Management! BrainX is the first system to combine patented intelligent learning technology with mastery-based learning strategies. BrainX participated in the landmark research on the neurobiology of effective sales training and used this research to design the BrainX system. The result is a system that accepts any type of content (e.g., product training, sales training, negotiation training) and stores the content in a way that allows the BrainX Digital Tutor to understand the content. The BrainX system figures out just what each person already knows along with what they need to learn. The system uses this information along with the understanding of the lesson content to build personalized lessons that teach each person just what they need: to know; to be able to do; and to believe about why something needs to be done in the correct way. With BrainX the days of one-size-fits-all sales training courses are gone forever. The BrainX system also builds a cognitive learning profile on each learner and uses this information to customize the way lessons are taught and to determine how much post-lesson reinforcement each salesperson needs. This approach is so effective that when compared to traditional online learning, BrainX users achieve mastery in 50% less time. www.BrainX.com
  • BrainX appears to be a system worth investigating as we develop tools to support assessment-based development.
dhtobey Tobey

Home - Performance Testing Council

  • The Performance Testing Council is your gateway to freely exchange experiences, knowledge, and yes, passion with others in the practice of performance testing. Membership will help you refine your evaluation program as you learn from experts, share best practices, help define research, expand your marketplace and help establish common delivery standards.
  • Community of interest group for performance testing.
dhtobey Tobey

Outgunned: How Security Tech Is Failing Us -- InformationWeek

  • "Years ago when we started writing checks, we might have been tackling five to 10 a day," says Paul Wood, a senior analyst with Symantec Hosted Services. "It's now well over 10,000 a day and growing." According to McAfee's 2010 Q2 Threat Report, the company identified 10 million pieces of malware in the first half of this year and is tracking close to 45 million in its malware database.
  • Vulnerability assessment products are also behind the curve, as Greg Ose and Patrick Toomey, both Neohapsis application security consultants, found when they recently set out to measure the relative effectiveness of various vulnerability scanners. "It's a question frequently raised by our customers," Toomey says. "They know the tools aren't going to catch all of the problems, but can they count on them to catch, say, 80% of the bad ones?" What Ose and Toomey discovered was far worse than even they had anticipated. Out of the 1,404 vulnerabilities accounted for by the Common Vulnerabilities and Exposures project during the sample period, there were only 371 signatures. In the best cases, the tools were in the 20% to 30% effectiveness range.
  • Toomey's observations are in line with those of security researcher Larry Suto, who earlier this year reported that Web application vulnerability scanners missed almost half (49%) of the vulnerabilities present during his tests.
  • But there's also a new twist to consider: With an increased number of attackers targeting and hijacking the credentials of IT personnel, the outsider can become the insider, at least from the perspective of our technology controls. Forward-thinking companies will move now to address this scenario. Think about how you'll detect large, anomalous query spikes against key tables in sensitive databases. Ensure you can spot large-scale document downloads from file shares and internal document management systems. If a hijacked credential is used to log into a large number of machines during a short time frame, you should have the ability to spot that activity.
    • dhtobey Tobey
      Investing in workforce development and professionalization of the infosec workforce may do more: combat ingenuity with ingenuity, not automation.
  • Investing even a small percentage of your security budget in only a few specialized systems to help here will go further than throwing good money at yesterday's outdated controls.
  • Stop rewarding ineffectiveness and start rewarding innovation. Maybe right now you're struggling with a scary realization: "The millions I'm spending on firewalls and antivirus technology is relatively worthless if my adversary is skilled."
  • Greg Shipley is an InformationWeek contributor and a former CTO
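The signature-coverage figures quoted in this item are easy to sanity-check: with only 371 signatures against 1,404 CVE entries in the sample period, even a scanner that fired perfectly on every signature could flag only about a quarter of the known vulnerabilities, consistent with the reported 20% to 30% effectiveness range.

```python
# Sanity check of the scanner-coverage figures cited in the article:
# 371 signatures versus 1,404 CVE entries in the sample period.
signatures = 371
cve_entries = 1404
coverage = signatures / cve_entries
print(f"Upper bound on detection coverage: {coverage:.1%}")  # → 26.4%
```

Note this ratio is an upper bound: it assumes every signature detects its vulnerability reliably, which the best-case 20% to 30% test results suggest is roughly, but not fully, the case.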
dhtobey Tobey

Competency Data For Training Automation.pdf

  • Great white paper from which we borrowed the SCORM graphic.