ODI Data Infrastructure, Institutions & Data Access Initiatives: Group items tagged ONS

Ben Snaith

Diamond_theFirstCut_pdf.pdf - 0 views

  • Currently, we are also unable to ascertain the extent to which our data sample is representative of the workforce it is trying to capture. Although we are reporting on 80,804 contributions from 5,904 contributors, the response rate is relatively low (24.3% of those invited to submit data). The low response rate and self-selecting nature of Diamond means there is the possibility of bias in the data we present here. We are taking this into account and will consider it as we undertake an equality analysis of the system one year on.
  • Diamond collects:
      • Actual diversity data (across six protected characteristic groups) from individuals (contributors) who have a role in making television, whether on- or off-screen; and
      • Perceived diversity data (across the six protected characteristics) of the on-screen contributors (i.e. diversity characteristics as viewers might perceive them).
  • Diamond is collecting:
      • Actual diversity data (from those making and appearing on television, including freelancers) and Perceived diversity data (how the viewer might perceive those they see on television)
      • Data across six protected characteristic groups: gender, gender identity, age, ethnicity, sexual orientation, and disability
      • Data from those making a significant contribution to a programme
      • Data from original programmes only, commissioned by the current five Diamond broadcasters for UK transmission
      • Data from programmes across all genres (although we do not currently report on news and sport) broadcast on a total of 30 channels across the five Diamond broadcasters.
  • ...2 more annotations...
  • Diamond diversity data is collected via the online platform Silvermouse, which is already used by many broadcasters and production companies to collect and manage other kinds of information about the programmes they make.
  • Diamond does not collect:
      • Data from programmes which have not been commissioned by the five Diamond broadcasters
      • Data on people working across broadcasting more generally, outside of production (in other words, our data are not overall workforce statistics)
      • Data where it is impractical to do so and where relevant privacy notices cannot be given. (Diamond does not collect data from every person appearing on television as part of a crowd scene, for example.)
  • This report provides an initial view of the data that has been collected and made available to CDN since Diamond went live on 15 August 2016.
Ben Snaith

Patterns of data institution that support people to steward data themselves, or become ... - 0 views

  • it enables people to contribute data about them to it and, on a case-by-case basis, people can choose to permit third parties to access that data. This is the pattern that many personal data stores and personal data management systems adopt in holding data and enabling users to unlock new apps and services that can plug into it. Health Bank enables people to upload their medical records and other information like wearable readings and scans to share with doctors or ‘loved ones’ to help manage their care; Japan’s accredited information banks might undertake a similar role. Other examples — such as Savvy and Datacoup — seem to be focused on sharing data with market research companies willing to offer a form of payment. Some digital identity services may also conform to this pattern.
  • it enables people to contribute data about them to it and, on a case-by-case basis, people can choose whether that data is shared with third parties as part of aggregate datasets. OpenHumans is an example that enables communities of people to share data for group studies and other activities. Owners of a MIDATA account can “actively contribute to medical research and clinical studies by granting selective access to their personal data”. The approach put forward by the European DECODE project would seem to support this type of individual buy-in to collective data sharing, in that case with a civic purpose. The concept of data unions advocated by Streamr seeks to create financial value for individuals by creating aggregate collections of data in this way. Although Salus Coop asks its users to “share and govern [their] data together… to put it at the service of collective return”, it looks as though individuals can choose which uses to put it to.
  • it enables people to contribute data about them to it and decisions about what third parties can access aggregate datasets are taken collectively. As an example, The Good Data seeks to sell browsing data generated by its users “entirely on their members’ terms… [where] any member can participate in deciding these rules”. The members of the Holland Health Data Cooperative would similarly appear to “determine what happens to their data” collectively, as would drivers and other workers who contribute data about them to Workers Info Exchange.
  • ...6 more annotations...
  • it enables people to contribute data about them and defer authority to it to decide who can access the data. A high-profile proposal of this pattern comes in the form of ‘bottom-up data trusts’ — Mozilla Fellow Anouk Ruhaak has described scenarios where multiple people “hand over their data assets or data rights to a trustee”. Some personal data stores and personal information management systems will also operate under this kind of delegated authority within particular parameters or settings.
  • people entrust it to mediate their relationships with services that collect data about them. This is more related to decisions about data collection rather than decisions about access to existing data, but involves the stewardship of data nonetheless. For example, Tom Steinberg has described a scenario whereby “you would nominate a Personal Data Representative to make choices for you about which apps can do what with your data… [it] could be a big internet company, it could be a church, it could be a trade union, or it could be a dedicated rights group like the Electronic Frontier Foundation”. Companies like Disconnect.Me and Jumbo are newer examples of this type of approach in practice.
  • it enables people to collect or create new data. Again, this pattern describes the collection rather than the re-use of existing data. For example, OpenBenches enables volunteers to contribute information about memorial benches, and OpenStreetMap does similar at much larger scale to collaboratively create and maintain a free map of the world. The ODI has published research into well-known collaboratively maintained datasets, including Wikidata, Wikipedia and MusicBrainz, and a library of related design patterns. I’ve included this pattern here as to me it represents a way for people to be directly involved in the stewardship of data, personal or not.
  • it collects data in providing a service to users and, on a case-by-case basis, users can share that data directly with third parties. This pattern enables users to unlock new services by sharing data about them (such as via Open Banking and other initiatives labelled as ‘data portability’), or to donate data for broader notions of good (such as Strava’s settings that enable its users to contribute data about them to aggregate datasets shared with cities for planning). I like IF’s catalogue of approaches for enabling people to permit access to data in this way, and its work to show how services can design for the fact that data is often about multiple people.
  • it collects data by providing a service to users and shares that data directly with third parties as provisioned for in its Terms and Conditions. This typically happens when we agree to Ts&Cs that allow data about us to be shared with third parties of an organisation’s choice, such as for advertising, and so might be considered a ‘dark’ pattern. However, some data collectors are beginning to do this for more public, educational or charitable purposes — such as Uber’s sharing of aggregations of data with cities via the SharedStreets initiative. Although the only real involvement we have here in stewarding data is in choosing to use the service, might we not begin to choose between services, in part, based on how well they act as data institutions?
  • I echo the point that Nesta recently made in their paper on ‘citizen-led data governance’, that “while it can be useful to assign labels to different approaches, in reality no clear-cut boundary exists between each of the models, and many of the models may overlap”
fionntan

Audits, External - Encyclopedia - Business Terms | Inc.com - 1 views

  • The auditor's unqualified report contains three paragraphs. The introductory paragraph identifies the financial statements audited, states that management is responsible for those statements, and asserts that the auditor is responsible for expressing an opinion on them. The scope paragraph describes what the auditor has done and specifically states that the auditor has examined the financial statements in accordance with generally accepted auditing standards and has performed appropriate tests. The opinion paragraph expresses the auditor's opinion (or formally announces his or her lack of opinion and why) on whether the statements are in accordance with generally accepted accounting principles.
  • Major types of audits conducted by external auditors include the financial statements audit, the operational audit, and the compliance audit. A financial statement audit (or attest audit) examines financial statements, records, and related operations to ascertain adherence to generally accepted accounting principles. An operational audit examines an organization's activities in order to assess performance and develop recommendations for improvements or further action. Auditors also perform statutory audits, which are carried out to comply with the requirements of a governing body, such as a federal, state, or city government or agency. A compliance audit has as its objective the determination of whether an organization is following established procedures or rules.
  • the auditor's final report to management often includes recommendations on methodologies of improving internal controls that are in place.
  • ...1 more annotation...
  • The primary goal of external auditing is to determine the extent to which the organization adheres to managerial policies, procedures, and requirements. The independent or external auditor is not an employee of the organization. He or she performs an examination with the objective of issuing a report containing an opinion on a client's financial statements. The attest function of external auditing refers to the auditor's expression of an opinion on a company's financial statements. The typical independent audit leads to an attestation regarding the fairness and dependability of the statements. This is communicated to the officials of the audited entity in the form of a written report accompanying the statements (an oral presentation of findings may sometimes be requested as well).
jeni10

UK Research and Development Roadmap - GOV.UK - 0 views

  • Another underutilised lever is procurement, in which government and public service providers can act as an early adopter and first customer for new technologies and ways of working.
  • Build on our innovation enabling infrastructure: Innovation happens throughout the UK, but access to the right support and facilities is not consistently available. Wherever we have high-quality R&D infrastructure we need to take full advantage of this, by encouraging the creation of new innovation zones and clusters of innovative firms around existing infrastructure across the UK. We will consider the full range of levers in doing so, from direct funding, to business support, to government’s ability to convene different actors and promote new opportunities. We want to build on our Catapult Network’s existing performance, boosting the benefits the network brings to local economies and addressing major challenges such as net zero carbon emissions. We will review whether they should all continue in their current form, exploring the potential to seize new opportunities. We will consider the opportunities provided by PSREs and other publicly funded research institutes, including establishing how government can best drive innovation through these organisations, for example through proof of concept for innovations and better sharing of ideas. To support publicly funded research infrastructures to make the most of their innovations, we will establish a fund to invest in innovative public sector ideas and a new unit to scout for and develop these opportunities.
  • Taking forward our National Data Strategy, we will improve access to trusted data resources at local and regional levels, improving the availability of evidence so that local leaders have what they need to build robust R&D ecosystems.
  • ...4 more annotations...
  • Data that could drive new discoveries or innovation is not always as available as it could be.
  • make the most of PSREs, which have the potential to deliver broad public policy objectives and help innovation translation
  • enable work across institutions to solve the grand challenges of our time
  • make the most of our institutions, to use research to improve both UK and devolved policy outcomes and to measure and refine programme performance
  • Crucially, we must embrace the potential of open research practices. First, we will require that research outputs funded by the UK government are freely available to the taxpayer who funds research. Such open publication will also ensure that UK research is cited and built on all over the world. We will mandate open publication and strongly incentivise open data sharing where appropriate, so that reproducibility is enabled, and knowledge is shared and spread collaboratively. Second, we will ensure that more modern research outputs are recognised and rewarded. For example, we will ensure that digital software and datasets are properly recognised as research outputs, so that we can minimise efforts spent translating digital outputs into more traditional formats. Third, we will consider the case for new infrastructure to enable more effective sharing of knowledge between researchers and with industry to accelerate open innovation where possible.
  • PSREs and other publicly funded research institutes: Public sector research establishments (PSREs) and other publicly funded institutes – including UKRI-funded institutes and institutes funded by the devolved administrations – are a diverse collection of bodies carrying out research. This research supports government objectives, including informing policy-making, statutory and regulatory functions and providing a national strategic resource in key areas of research. They can also provide emergency response services. They interact with businesses around a wide array of innovation-related functions. We want to get the most value out of the whole range of PSREs and publicly funded institutes, laboratories and campuses. The current PSRE and institute landscape is complex. There is an opportunity to raise awareness and support development of strategic national laboratory capability, develop closer relationships across the network of PSREs and institutes to address cross-cutting priorities, and develop more consistent, co-ordinated and accessible funding for PSREs. Most programmes do not include funding for the full costs of overheads – this sometimes prevents our national laboratories from participating in UK government priority programmes without making a loss. A more flexible approach, funding a higher proportion of the economic costs, would increase spending efficiency, encourage more effective investments and maximise their benefits. Building on the 2019 Science Capability Review, we will:
      • champion the development of a truly strategic, national laboratory capability and identify opportunities to strengthen their capabilities and ability to collaborate, especially with the private sector, devolved administrations and local civic authorities
      • work to understand current capacity and capability, including spare capability, and to ensure that national labs, PSREs and other publicly funded institutes are working together as part of business as usual rather than only in times of crisis
      • explore the potential for all PSREs to have access to more funding opportunities from UKRI so that PSREs are viewed as national assets rather than the property of UK government departments
fionntan

Evolution of Auditing: From the Traditional Approach to the Future Audit - 3 views

shared by fionntan on 10 Jun 20
  • advances in information technology in conjunction with real-time approaches to conducting business are challenging the auditing profession.
  • emphasis has historically been placed on a periodic, backward-looking approach whereby key events and activities are often identified long after their occurrence or simply go undetected. Given that recent developments and technologies have facilitated a movement away from the historical paradigm and toward a more proactive approach, it is essential that auditors understand what the future audit entails and how they might begin to envision a logical progression to such a state.
  • Furthermore, refinements of audit standards generally consisted of reactionary measures that occurred in response to significant negative business events.
    • fionntan
       
      audits and accounting standards seem to happen after things have gone wrong, with new regulations created in response
  • ...5 more annotations...
  • As a result, the AICPA issued Statement on Auditing Procedure (SAP) No. 1 in October 1939 and it required that auditors inspect inventories and confirm receivables. Consequently, auditors became responsible for auditing the business entity itself rather than simply relying upon management verification routines.
  • First, in 1961 Felix Kaufman wrote Electronic Data Processing and Auditing. The book compares auditing around and through the computer. Historically, auditing around the computer entails traditional manual procedures in which the existence of automated equipment is ignored. As such, the computer is treated as a black box. In this context, auditors rely upon physical inputs to and outputs from automated devices and do not concern themselves with how processing actually occurs within the system(s). Conversely, auditing through the computer involves actual use of computer systems in testing both controls and transactions. Finally, auditing with the computer entails direct evaluation of computer software, hardware, and processes. Consequently, auditing through the computer or with the computer is able to provide a much higher level of assurance when contrasted with auditing around the computer.
  • Although some aspects of the traditional audit will continue to hold value, the audit of the future provides opportunities to increase the use of automated tools and remains a key for offering improved assurances relative to the responsible management and utilization of stakeholder assets.
  • As previously mentioned, basic CAATs contain capabilities to enhance audit effectiveness and efficiency. However, they do not operate on a 24/7 basis and therefore fail to construct a truly continuous auditing environment whereby exceptions and anomalies may be identified as they occur. Alternatively stated, they do not work with real-time or close to real-time data streams and, thus, are not able to address questionable events such as potential fraud or irregularities in an optimized fashion. Cangemi (2010) argues that, given the recent advances in business technologies, the continuing emphasis on the backward-looking audit is simply an outdated philosophy. Instead, he believes that real-time solutions are needed (a minimal sketch of the idea follows this list).
  • Future audit approaches would likely require auditors, regulators, and standards setters to make significant adjustments. Such adjustments might include (1) changes in the timing and frequency of the audit, (2) increased education in technology and analytic methods, (3) adoption of full population examination instead of sampling, (4) re-examination of concepts such as materiality and independence, and (5) mandating the provisioning of the audit data standard. Auditors would need to possess substantial technical and analytical skills.
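To make the contrast with periodic, backward-looking checks concrete, here is a minimal sketch of the continuous-auditing idea described above: transactions are tested as they arrive and exceptions surface immediately rather than months later. The transaction fields, rules and thresholds are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch: flag exceptions in a transaction stream as events occur,
# rather than sampling records long after the fact. All fields, rules and
# thresholds here are hypothetical illustrations.
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Transaction:
    txn_id: str
    amount: float
    submitter: str
    approver: str

def continuous_audit(stream: Iterable[Transaction],
                     limit: float = 10_000.0) -> Iterator[str]:
    """Yield an exception message the moment a rule is breached."""
    for txn in stream:
        if txn.amount > limit:
            yield f"{txn.txn_id}: amount {txn.amount:,.2f} exceeds limit {limit:,.2f}"
        if txn.approver == txn.submitter:
            yield f"{txn.txn_id}: submitter approved their own transaction"

# Usage: alerts appear while the data is still live, not at year end.
stream = [
    Transaction("T1", 12_500.0, "bob", "alice"),
    Transaction("T2", 900.0, "carol", "carol"),
]
for alert in continuous_audit(stream):
    print(alert)
```

Note that a check like this examines the full population of events rather than a sample, which matches adjustment (3) in the annotation above.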
Sonia Duarte

How do different communities create unique identifiers? - Lost Boy - 0 views

  • Identifiers play an important role, helping to publish, structure and link together data.
  • The simplest way to generate identifiers is by a serial number.
  • the Ordnance Survey TOID identifier is a serial number that looks like this: osgb1000006032892. UPRNs are similar.
  • ...15 more annotations...
  • Some serial numbering systems include built-in error-checking to deal with copying errors, using a check digit (one common scheme is sketched after this list).
  • The second way of providing unique identifiers is using a name or code.
  • These are typically still assigned by a central authority, sometimes known as a registration agency, but they are constructed in different ways.
  • Encoding information about geography and hierarchy within codes can be useful. It can make them easier to validate.
  • It also means you can manipulate them…
  • But encoding lots of information in identifiers also has its downsides. The main one being dealing with changes to administrative areas that mean the hierarchy has changed. Do you reassign all the identifiers?
  • some identifier systems look at reducing the burden on that central authority.
  • federated assignment. This is where the registration agency shares the work of assigning identifiers with other organisations.
  • Another approach to reducing dependence on, and coordination with a single registration agency, is to use what I’ll call “local assignment“.
  • A simplistic approach to local assignment is “block allocation“: handing out blocks of pregenerated identifiers to organisations which can locally assign them.
  • Here the registration agency still generates the identifiers, but the assignment of identifier to “thing” is done locally.
  • A more common approach is to use “prefix allocation“. In this approach the registration agency assigns individual organisations a prefix within the identifier system (sketched after this list).
  • One challenge with prefix allocation is ensuring that the rules for locally assigned suffixes work in every context where the identifier needs to appear.
  • In distributed assignment of identifiers, anyone can create an identifier. Rather than requesting an identifier, or a prefix from a registration agency, these systems operate by agreeing rules for how unique identifiers can be constructed.
  • A hash-based identifier takes some properties of the thing we want to identify and then uses them to construct an identifier (see the final sketch below).
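The check-digit idea mentioned above can be made concrete in a few lines. This is a minimal sketch using the Luhn mod-10 algorithm (the scheme behind payment card numbers), purely as an illustration; identifier systems like TOIDs or UPRNs may use different schemes, or none.

```python
# Minimal sketch of check-digit error detection using the Luhn mod-10
# algorithm (shown purely to illustrate built-in error checking).

def luhn_check_digit(digits: str) -> str:
    """Compute the check digit to append to a numeric identifier."""
    total = 0
    # Walk right to left, doubling every second digit; doubled digits
    # greater than 9 have 9 subtracted (equivalent to summing their digits).
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def is_valid(identifier: str) -> bool:
    """True if the final character is a correct Luhn check digit."""
    return luhn_check_digit(identifier[:-1]) == identifier[-1]

serial = "7992739871"
print(serial + luhn_check_digit(serial))  # 79927398713
print(is_valid("79927398713"))            # True
print(is_valid("78927398713"))            # False: copying error caught
```

A single mistyped digit always changes the Luhn total, so simple copying errors are caught before a bad identifier propagates.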
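Prefix allocation, described a few annotations up, can likewise be shown in miniature: the registration agency's only job is to hand each organisation a unique prefix, after which identifiers are minted locally with no further coordination. The DOI-like shape and all names below are illustrative assumptions, not a real API.

```python
# Minimal sketch of prefix allocation: a registration agency assigns each
# organisation a unique prefix once; the organisation then assigns suffixes
# locally. The DOI-like shape and names are illustrative.
import itertools

class RegistrationAgency:
    def __init__(self) -> None:
        self._next_prefix = itertools.count(1000)
        self.registry: dict[str, str] = {}

    def allocate_prefix(self, org_name: str) -> str:
        """Hand out a unique prefix and record who owns it."""
        prefix = f"10.{next(self._next_prefix)}"
        self.registry[prefix] = org_name
        return prefix

class Organisation:
    def __init__(self, prefix: str) -> None:
        self.prefix = prefix
        self._next_suffix = itertools.count(1)

    def mint(self) -> str:
        """Assign a suffix locally; global uniqueness comes from the prefix."""
        return f"{self.prefix}/{next(self._next_suffix)}"

agency = RegistrationAgency()
acme = Organisation(agency.allocate_prefix("Acme Publishing"))
print(acme.mint())  # 10.1000/1
print(acme.mint())  # 10.1000/2
```

The challenge noted above, that locally assigned suffixes must be valid in every context where the identifier appears, would show up here as rules on what characters `mint` may use.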
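Finally, the hash-based pattern from the last annotation: derive the identifier from the thing's own properties, so anyone starting from the same properties computes the same identifier and no registry is needed at all. The property names and the 16-character truncation are illustrative assumptions.

```python
# Minimal sketch of a hash-based identifier: hash a canonical form of the
# thing's properties. Property names and digest truncation are illustrative.
import hashlib

def hash_identifier(properties: dict) -> str:
    # Canonicalise first (fixed key order, trimmed, lowercased) so that
    # trivially different descriptions of the same thing hash identically.
    canonical = "|".join(
        f"{key}={str(properties[key]).strip().lower()}"
        for key in sorted(properties)
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

bench = {"lat": "51.5007", "lon": "-0.1246", "inscription": "In memory of A."}
print(hash_identifier(bench))  # same properties always yield the same identifier
```

The flip side is that if any property changes, the identifier changes with it, so hash-based schemes suit stable, immutable descriptions.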
Ben Snaith

Every day, we rely on digital infrastructure built by volunteers. What happens when it ... - 0 views

  • Free and public code grew in direct response to the perceived failings of expensive, proprietary commercial software. As a result, the heart of the problem with digital infrastructure is also part of what makes it so rich with potential: It is not centralized. There is no one person or entity deciding what’s needed and what’s not. There is also no one overseeing how digital infrastructure is implemented. And because the community of volunteers developing this infrastructure has a complicated relationship with what might be seen as a more traditional, or official, way of doing things, few digital infrastructure projects have a clear business model or source of revenue. Even projects that have grown to be used by millions of people tend to lack a cohesive structure and plan for sustaining the technology’s long-term development.
  • We need to start by educating people who are in positions to provide support. Many of them—from start-up engineers to government officials—don’t know enough about how digital infrastructure functions and what it requires, or believe that public software doesn’t need support.
Ben Snaith

AnnualReport2018LandPortal.pdf - 0 views

shared by Ben Snaith on 01 Jun 20
  • We are proud to say that the Land Portal has become by far the world’s leading source of data and information on land, with more than 30,000 visits a month, a majority of which are from the Global South. The Land Portal now counts more than 760 land-related indicators from 45 datasets aggregated from trusted sources around the world. This great diversity of information feeds into a dozen thematic portfolios and more than 60 country portfolios that combine data with the latest news, relevant publications, organizations and more.
  • Land Portal is leading the way on the adoption of open data and making land governance information more accessible. We brought together the land governance community for the Partnership for Action workshop to set the stage for building an information ecosystem on land governance, resulting in an action plan on data collection, management and dissemination. The Land Portal’s approach to capacity building was further refined at a workshop in Pretoria, South Africa, which created a great deal of momentum to adopt open data practices in this country
  • We are grateful for the steadfast support of the United Kingdom’s Department for International Development (DFID), as well as the support of the Omidyar Network. We are also thankful for the support of the Food and Agriculture Organization of the United Nations (FAO), GIZ - German Cooperation and the collaboration of all of our partners, without which our work would not be possible.
Ben Snaith

345725803-The-state-of-weather-data-infrastructure-white-paper.pdf - 1 views

  • From its early beginnings over 150 years ago, weather forecasting at the Met Office has been driven by data. Simple observations recorded and used to hand-plot synoptic charts have been exchanged for the billions of observations received and handled every day, mainly from satellites but also from weather stations, radar, ocean buoys, planes, shipping and the public.
  • The key stages of the weather data value chain are as follows:
      • Monitoring and observation of the weather and environment, e.g. by NMSs.
      • Numerical weather prediction (NWP) and climate modelling carried out by NMSs to create global, regional and limited area weather forecasts. Private companies are growing their presence in the market and challenging the traditional role of NMSs to provide forecasts to the public, by statistically blending data from NMS models to create their own forecast models, for example. Other companies providing data via online channels and/or apps include The Weather Company, Accuweather or the Climate Corporation.
      • Communication and dissemination of forecasts by news, NMS and media organisations like the BBC, Yahoo and Google, or within consumer-targeted mobile and web applications.
      • Decision making by individuals and businesses across a variety of sectors, which draws on weather data and reporting.
  • The core data asset of our global weather data infrastructure is observation data that captures a continuous record of weather and climate data around the world. This includes temperature, rainfall, wind speed and details of a host of other atmospheric, surface and marine conditions.
  • ...5 more annotations...
  • The collection of observation data is a global effort. The Global Observing System consists of around 11,000 ground-based monitoring stations supplemented with thousands of sensors installed on weather balloons, aircraft and ships. Observations are also collected from a network of radar installations and satellite-based sensors. As we see later, the ‘official’ observation system is increasingly being supplemented with new sources of observation data from the Internet of Things (IoT).
  • Ensemble model forecasts aim to give an indication of the range of possible future states of the atmosphere and oceans (which are a significant driver of the weather in the atmosphere). This overcomes errors introduced by using imperfect measurements of initial starting conditions that are then amplified by the chaotic nature of the atmosphere. Increasing the number of forecast members over a global scale and at higher resolutions results in data volumes increasing exponentially (a rough worked example follows this list).
  • Created in 1950, the World Meteorological Organisation (WMO) is made up of 191 member states and territories around the world. The WMO was founded on the principle that global coordination was necessary to reap the benefits of weather and climate data. This includes a commitment to weather data and products being freely exchanged around the world (Obasi, 1995).
  • While the WMO has a global outlook, its work is supplemented by regional meteorological organisations like the European Centre for Medium Range Weather Forecasts (ECMWF) and NMSs, such as the Met Office in the UK
  • New sources of weather observation data are increasingly available. In recent years, services like Weather Underground and the Met Office’s Weather Observation Website have demonstrated the potential for people around the world to contribute weather observations about their local areas – using low-cost home weather stations and sensors, for example. But there is now potential for sensors in cities, homes, cars, cell towers and even mobile phones to contribute observational data that could also be fed into forecast models.
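As a rough worked example of the scaling pressure described in the ensemble annotation above (all figures here are illustrative assumptions, not Met Office numbers), output volume grows multiplicatively with each parameter:

```latex
% Illustrative scaling of ensemble forecast output volume.
% V ~ members x horizontal grid points x vertical levels x variables x output steps
\[
V \;\propto\; N_{\mathrm{members}} \times N_{\mathrm{grid}} \times
N_{\mathrm{levels}} \times N_{\mathrm{vars}} \times N_{\mathrm{steps}}
\]
% Halving the horizontal grid spacing quadruples N_grid, so moving from a
% 20-member to a 50-member ensemble while halving grid spacing already
% multiplies output volume by a factor of ten, before any increase in
% vertical levels, variables or output frequency:
\[
\frac{V_{\mathrm{new}}}{V_{\mathrm{old}}} \;=\; \frac{50}{20} \times 4 \;=\; 10
\]
```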
Sonia Duarte

How have National Statistical Institutes improved quality in the last 25 years? - IOS P... - 0 views

  • There are still major efforts needed to continuously improve. More focus needs to be put on measuring internal processes, costs, and components of quality other than accuracy. Documentation needs to be regularly updated, methods for incorporating Big Data developed, and flexibility improved so that adaptive methods based on paradata can be used.
  • it takes regular management involvement and procedures to be in place for it to succeed
  • Measurements are vital, but they are not the goal. This will require re-focusing on improving internal processes. It also implies recognizing the need to track costs as a component of quality.
  • ...1 more annotation...
  • While it will continue to be important for NSIs to increase their use of administrative data and the many sources of Big Data, these will rarely be able to be used as stand-alone sources. More often these sources will need to be combined with well-designed survey data to produce a blended, improved product.
  • "Principles and Practices for a Federal Statistical Agency"
fionntan

Declaration of Conformity - Work equipment and machinery - 0 views

  • The precise requirements are specified in each relevant directive, but essentially Declarations of Conformity should include the following:
      • business name and full address of the manufacturer and, where appropriate, his authorised representative
      • for machinery, the name and address of the person authorised to compile the technical file, who must be established in the Community
      • description and identification of the product, which may include information on model, type, and serial number
      • a declaration that the product fulfils all relevant provisions of the applicable Directive(s)
      • where appropriate, a reference to the harmonised standards used, and to which conformity is declared
      • where appropriate, the name, address and identification number of the notified body which carried out conformity assessment
      • the identity and signature of the person empowered to draw up the declaration on behalf of the manufacturer or his authorised representative
  • When must a Declaration of Conformity be provided? For most new products the Declaration of Conformity must accompany the product through the supply chain to the end user.
  • This document should declare key information, including:
      • the name and address of the organisation taking responsibility for the product
      • a description of the product
      • a list of which product safety Directives it complies with
      • possibly, details of relevant standards used
      • the date, and the signature of a representative of the organisation placing it on the EU/EEA market
diaszasz

Factual | Business Listings in Factual Data - 1 views

  • Factual started out as an aggregator that allowed organisations to deposit point of interest data to create an aggregated set. Their original business model, IIRC, was around licensing that dataset, but contributors got free access or favourable terms. I've noticed that they've changed their model, so the work of contributing the data is done via "Trusted Data Contributors" who appear to take on the work and responsibility for vetting upstream contributions. https://www.factual.com/updatelisting/ Sharing because I think the evolution is interesting, as is the approach to certifying upstream contributions. Relevant to the certification/audit discussion. Similar issues with some of the alt data ecosystem too, I expect.
  • Some background on their early days in this 2012 podcast: https://cloudofdata.com/2012/01/data-market-chat-tyler-bell-discusses-factual/
fionntan

UKAS : The Benefits - 1 views

shared by fionntan on 15 Jun 20
  • It provides the following benefits:
      • Competitive advantage: accreditation provides independent assurance that your staff is competent. It can set you apart from the competition, and enable you to compete with larger organisations.
      • Market access: accreditation is specified by an increasing number of public and private sector organisations. UKAS accreditation is also recognised and accepted globally, therefore opening up opportunities overseas.
      • Accreditation can highlight gaps in capability, thereby providing the opportunity for improved organisational efficiency and outputs.
      • There are a number of insurance brokers and underwriters that recognise accreditation as an important factor in assessing risk, and can therefore offer lower premiums.
  • Organisations can save time and money by selecting an accredited and therefore competent supplier.
  • Provide an alternative to Regulation whilst ensuring the reliability of activities that have the potential to impact on public confidence, health and safety or the environment.
  • ...1 more annotation...
  • Accreditation is a means of assessing, in the public interest, the technical competence and integrity of organisations offering evaluation services. Accreditation, with its many potential benefits for the quality of goods and in the provision of services throughout the supply chain, underpins practical applications of an increasingly wide range of activities across all sectors of the economy, from fishing to forestry, construction to communications. Independent research has confirmed that accreditation has a positive economic value of nearly £1bn to the UK economy each year.
Sonia Duarte

UKSA-Business-Plan-April-2019-to-March-2022.pdf - 3 views

  • At this point in time the forecasts for the years beyond 2019/20 are under consideration as we continue to develop our future strategy and bid for the forthcoming Spending Review (2019). As stated previously in the plan, 2019/20 is a key year for us in securing the funding required to achieve our ambitions.
  • we will have met our agreed financial targets as part of Spending Review (2015). We also remain broadly on track to deliver our target level of efficiencies over the Spending Review period as indicated in Figure 2 below.
  • Modernising Corporate Support (Efficient)
      Improving our oversight:
      • Progress with corporate systems improvement projects – milestones in the delivery of automation, workforce planning, improvements to systems controls (new).
      Delivering value from our resources:
      • Meeting our financial delegations – budget/forecast accuracy.
      • Delivering our agreed benefits – Census benefits and ESTP benefits – track of deliveries (new).
      • Delivering our agreed efficiencies – over the SR15 period.
  • ...2 more annotations...
  • we are nearing the Census delivery date in 2021 with the funding required for Census and Data Collection Transformation activities increasing significantly over the next three financial years.
  • Table 1: High level Authority funding position 2019/20 – based on the Main Supply Estimate agreement with HM Treasury (£m)

      Funding Stream                               2018/19   2019/20   Variance
      Resource
        SR15 Baseline                               164.85    156.95    (7.89)
        Income funded                                24.26     26.60     2.34
        Census and Data Collection Transformation    94.00    104.00    10.00
        Bean HMT Contribution                         4.00      9.00     5.00
        Trade Statistics                                 -      2.40     2.40
        Pension                                          -      4.20     4.20
        Budget Cover Transfers                        0.50     (0.22)   (0.72)
      Subtotal Resource                             287.61    302.93    15.33
      Capital                                        13.43      7.00    (6.43)
      Depreciation                                   23.10     21.30    (1.80)
      Income Target                                  24.26     26.60     2.34
      Annually Managed Expenditure (AME)             (0.84)    (1.00)   (0.16)
fionntan

Closing the AI Accountability Gap: Defining an End-to-End Framework for Internal Algori... - 1 views

shared by fionntan on 08 Jun 20
  • In environmental studies, Lynch and Veland [45] introduced the concept of urgent governance, distinguishing between auditing for system reliability vs societal harm. For example, a power plant can be consistently productive while causing harm to the environment through pollution [42].
  • the organizations designing and deploying algorithms can [be held accountable] through governance structures. Proposed standard ISO 37000 defines this structure as "the system by which the whole organization is directed, controlled and held accountable to achieve its core purpose over the long term."
  • ...4 more annotations...
  • The IEEE standard for software development defines an audit as “an independent evaluation of conformance of software products and processes to applicable regulations, standards, guidelines, plans, specifications, and procedures”
  • This is a repeatedly observed phenomenon in tax compliance auditing, where several international surveys of tax compliance demonstrate that a fixed and vetted tax audit methodology is one of the most effective strategies to convince companies to respect audit results and pay their full taxes
  • Complex systems tend to drift toward unsafe conditions unless constant vigilance is maintained [42]. It is the sum of the tiny probabilities of individual events that matters in complex systems: if this grows without bound, the probability of catastrophe goes to one. The Borel-Cantelli lemmas are formalizations of this statistical phenomenon [13] (stated after this list).
  • Failure Modes and Effects Analysis (FMEA) is a methodical and systematic risk management approach that examines a proposed design or technology for foreseeable failures [72]. The main purpose of an FMEA is to define, identify and eliminate potential failures or problems in different products, designs, systems and services.
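For reference, the lemmas that the drift-toward-catastrophe annotation alludes to can be stated briefly; the second (which requires independence) is the one matching the "probability goes to one" intuition:

```latex
% Borel-Cantelli lemmas for a sequence of events A_1, A_2, ...
% "A_n i.o." (infinitely often) is the event that infinitely many A_n occur.
\[
\textbf{(1)}\quad \sum_{n=1}^{\infty} \Pr(A_n) < \infty
\;\Longrightarrow\; \Pr(A_n \text{ i.o.}) = 0
\]
\[
\textbf{(2)}\quad A_n \text{ independent and }
\sum_{n=1}^{\infty} \Pr(A_n) = \infty
\;\Longrightarrow\; \Pr(A_n \text{ i.o.}) = 1
\]
```

In the complex-systems reading above, each A_n is a small individual failure event: if the events are roughly independent and their probabilities do not shrink fast enough for the sum to stay finite, then with probability one failures keep occurring.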
Ben Snaith

Actually, nonprofits don't spend enough money on overhead - Quartz - 0 views

  • Successful organizations require financial systems, information technology, volunteer management and sustainable revenue streams. Part of the myth of the nonprofit world is that somehow righteousness will ultimately triumph over limited planning, crappy systems and a general scarcity of resources. But that is not the way the world works.
Ben Snaith

Business models for sustainable research data repositories | OECD - 3 views

shared by Ben Snaith on 01 Jun 20
  • However, for the benefits of open science and open research data to be realised, these data need to be carefully and sustainably managed so that they can be understood and used by both present and future generations of researchers. Data repositories - based in local and national research institutions and international bodies - are where the long-term stewardship of research data takes place and hence they are the foundation of open science. Yet good data stewardship is costly and research budgets are limited. So, the development of sustainable business models for research data repositories needs to be a high priority in all countries.
  • The 47 data repositories analysed reported 95 revenue sources. Typically, repository business models combine structural or host funding with various forms of research and other contract-for-services funding, or funding from charges for access to related value-added services or facilities. A second popular combination is deposit-side funding combined with a mix of structural or host institutional funding, or with revenue from the provision of research, value-added, and other services.
  • Research data repositories themselves can take advantage of the underlying economic differences between research data, which exhibit public good characteristics, and value-adding services and facilities, which typically do not, to develop business models that support free and open data while charging some or all users for access to value-adding services or related facilities
  • ...12 more annotations...
  • Over the centuries, libraries, archives, and museums have shown the practical and policy advantages of preserving sources of knowledge for society. Research and other types of data constitute a relatively new subject that requires our serious attention. Although some research data repositories were founded in the 1960s and even earlier, the data that are now being generated have resulted in the establishment of many new repositories and related infrastructure. Societies need such repositories to ensure that the most useful or unique data are preserved over the long term.
  • First, there are substantial and positive efficiency impacts, not only reducing the cost of conducting research, but also enabling more research to be done, to the benefit of researchers, research organisations, their funders, and society more widely
  • substantial additional reuse of the stored data, with between 44% and 58% of surveyed users across the studies saying they could neither have created the data for themselves nor obtained them elsewhere.
  • While these studies tend to provide a snapshot of the repository's value, which can be affected by the scale, age and prominence of the data repository concerned, it is important to note that in most cases, data archives are appreciating rather than depreciating assets. Most of the economic impact is cumulative and it grows in value over time, whereas most infrastructure (such as ships or buildings) has a declining value as it ages. Like libraries, data collections become more valuable as they grow and the longer one invests in them, provided that the data remain accessible, usable, and used.
  • Openness of public information strengthens freedom and democratic institutions by empowering citizens, and supporting transparency of political decision-making and trust in governance. It is no coincidence that the most repressive regimes have the most secretive institutions and activities (Uhlir, 2004). Open factual datasets also enhance public decision-making from the national to the local levels (Nelson, 2011), and open data policies demonstrate confidence of leadership and generally can broaden the influence of governments (Uhlir and Schröder, 2007). Countries that may be lagging behind socioeconomically frequently can benefit even more from access to public data resources (NRC, 2012b, 2002).
  • The survey of repositories undertaken for this and the previous RDA-WDS study classified the principal research data repository revenue sources as follows:
      • Structural funding (i.e. central funding or contract from a research or infrastructure funder that is in the form of a longer-term, multi-year contract). We use the term “structural” to underline the difference between this and project funding. The research data repository is considered as a form of research infrastructure or as providing an ongoing service. Although the funding may be regularly reviewed, it is a form of funding that is substantively different to project funding.
      • Host institution funding and support (i.e. direct or indirect support from a host institution). Some research data repositories are hosted by a research performing institution, e.g. a university, and receive direct funding or indirect (but costed) support from their host.
      • Data deposit fees (i.e. in the form of annual contracts with depositing institutions or per-deposit fees). As indicated, this can take the form of a period contract or a charge per deposit. In either case, the cost is borne by the entity that wishes to ensure that the data are preserved and curated for the long term.
      • Access charges (i.e. charging for access to standard data or to value-added services and facilities). This covers charges of various sorts (e.g. contract or per-access charges) and can be levied either for standard data or value-added services. In all cases, the cost is borne by the entity that wishes to access and use the data.
      • Contract services or project funding (i.e. charges for contract services to other parties or for research contracts). This covers short-term contracts and projects for various activities not covered above (i.e. these are not contracts to deposit or access data, but cover other services that may be provided). Similarly, this category of funding is distinct from structural funding because, although it may come from a research or infrastructure funder, it is for specific, time- and objective-limited projects, rather than for ongoing services or infrastructure.
  • The 47 data repositories analysed reported 95 revenue sources, an average of two per repository. Twenty-four repositories reported funding from more than one source, and seven reported more than three revenue sources. Combining revenue sources is an important element in developing a sustainable research data infrastructure.
  • A large majority (more than 80%), said they would not be considering any revenue sources that are incompatible with the open data principle.
  • The stage of development of a repository, its institutional or disciplinary context, its scale, and level of federation are also important determinants of what might be a sustainable business model. Referring to the dynamic of the evolution of firms, some economists draw a human parallel, talking of the phases as births, deaths, and marriages (and sometimes divorces). All phases are needed and should be accommodated. Indeed, sometimes it may not be desirable, effective, or efficient for a repository to be sustainable - provided that the data can continue to be hosted elsewhere.
  • This is the situation facing research data repositories. To be sustainable, data repositories need to generate sufficient revenue to cover their costs, but setting a price above the marginal cost of copying and distribution will reduce net welfare
  • Actions needed to develop a successful research data repository business model include:
      • Understanding the lifecycle phase of the repository's development (e.g. the need for investment funding, development funding, ongoing operational funding, or transitional funding)
      • Identifying who the stakeholders are (e.g. data depositors, data users, research institutions, research funders, and policy makers)
      • Developing the product/service mix (e.g. basic data, value-added data, value-added services and related facilities, and contract and research services)
      • Understanding the cost drivers and matching revenue sources (e.g. scaling with demand for data ingest, data use, the development and provision of value-adding services or related facilities, research priorities, and policy mandates)
      • Identifying revenue sources (e.g. structural funding, host institutional funding, deposit-side charges, access charges, and value-added services or facilities charges)
      • Making the value proposition to stakeholders (e.g. measuring impacts and making the research case, measuring value and making the economic case, informing, and educating) (Figure 6)
Sonia Duarte

Clinical Service Accreditation - HQIP - 0 views

  • CSA provides this quality improvement framework in three main ways:
      • by supporting clinical services to develop new accreditation schemes
      • by supporting existing clinical services accreditation schemes to attain professionally-led standards
      • by demonstrating the impact of accreditation in achieving high quality clinical services
  • HQIP (on behalf of the Clinical Service Accreditation Sponsor Group), BSI and the United Kingdom Accreditation Service (UKAS) share a common purpose in improving the quality of healthcare services through standards and accreditation.
  • These resources provide professional bodies, CEOs, managers and clinical leaders with a framework for setting up and managing clinical accreditation schemes.
  • ...1 more annotation...
  • Resources include:
      • Development of standards for clinical service accreditation schemes
      • Sharing and improving accreditation methodologies
      • How to map clinical services into groupings for the development of accreditation schemes
      • PAS 1616 – a generic framework of standards for accrediting clinical services from the British Standards Institute
      • Information management, data, and systems
      • Support for development of accreditation schemes