BI-TAGS: Group items tagged "process"

You Probably Need Parallel Except When You Don't

  • If you are running a large Oracle data warehouse, you should be using parallel.
  • Like all tools, parallel has to be used correctly; no more should you use a wrench to hammer a nail than you should treat parallel as the answer to every performance problem. Sometimes parallel will make things worse; sometimes it will make performance less predictable.
  • Parallel introduces additional work to a query. Simplistically, we need to split the query into multiple parallel processes, execute them, wait for the processes to complete and finally coordinate the results. All of this takes time; the time saving comes from being able to process multiple smaller chunks of data simultaneously. If executing a step in parallel is not significantly faster than doing it serially, the additional overhead may make parallel processing the slower option; this is typically the case with small tables, where a full table scan or an indexed access is already fast. Use too few parallel processes and we gain little in performance; use too many and we risk starving the database of resources for other work, or even slowing our own process as it waits for resources. If you have implemented some form of CPU resource management on your system, you may find that you experience delays as your parallel slaves 'wait their turn'.
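To make the trade-off above concrete, here is a minimal sketch (not from the post, using a hypothetical SALES fact table, a hypothetical PRODUCTS lookup table, and an arbitrary degree of parallelism) of where a parallel hint pays off and where it does not:

```sql
-- Large fact-table aggregation: splitting the scan of SALES across 8 parallel
-- processes can repay the split/execute/coordinate overhead described above.
SELECT /*+ PARALLEL(s, 8) */
       s.product_id,
       SUM(s.amount_sold) AS total_sold
FROM   sales s
GROUP  BY s.product_id;

-- Small lookup: a serial, indexed access is already fast, so requesting
-- parallel here would only add coordination overhead.
SELECT p.product_name
FROM   products p
WHERE  p.product_id = 42;
```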

What is business intelligence (BI)? - Definition from WhatIs.com

  • Business intelligence is a data analysis process aimed at boosting business performance by helping corporate executives and other end users make more informed decisions.
  • Business intelligence (BI) is a technology-driven process for analyzing data and presenting actionable information to help corporate executives, business managers and other end users make more informed business decisions.
  • BI encompasses a variety of tools, applications and methodologies that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards and data visualizations to make the analytical results available to corporate decision makers as well as operational workers.
  • The potential benefits of business intelligence programs include accelerating and improving decision making; optimizing internal business processes; increasing operational efficiency; driving new revenues; and gaining competitive advantages over business rivals. BI systems can also help companies identify market trends and spot business problems that need to be addressed.
  • BI data can include historical information, as well as new data gathered from source systems as it is generated, enabling BI analysis to support both strategic and tactical decision-making processes.
  • BI programs can also incorporate forms of advanced analytics, such as data mining, predictive analytics, text mining, statistical analysis and big data analytics.
  • In many cases though, advanced analytics projects are conducted and managed by separate teams of data scientists, statisticians, predictive modelers and other skilled analytics professionals, while BI teams oversee more straightforward querying and analysis of business data.
  • Business intelligence data typically is stored in a data warehouse or smaller data marts that hold subsets of a company's information. In addition, Hadoop systems are increasingly being used within BI architectures as repositories or landing pads for BI and analytics data, especially for unstructured data, log files, sensor data and other types of big data. Before it's used in BI applications, raw data from different source systems must be integrated, consolidated and cleansed using data integration and data quality tools to ensure that users are analyzing accurate and consistent information.
  • In addition to BI managers, business intelligence teams generally include a mix of BI architects, BI developers, business analysts and data management professionals; business users often are also included to represent the business side and make sure its needs are met in the BI development process.
  • To help with that, a growing number of organizations are replacing traditional waterfall development with Agile BI and data warehousing approaches that use Agile software development techniques to break up BI projects into small chunks and deliver new functionality to end users on an incremental and iterative basis.
  • consultant Howard Dresner is credited with first proposing it in 1989 as an umbrella category for applying data analysis techniques to support business decision-making processes.
  • Business intelligence is sometimes used interchangeably with business analytics; in other cases, business analytics is used either more narrowly to refer to advanced data analytics or more broadly to include both BI and advanced analytics.

Magic Quadrant for Business Intelligence and Analytics Platforms

  • Integration
    BI infrastructure: All tools in the platform use the same security, metadata, administration, portal integration, object model and query engine, and should share the same look and feel.
    Metadata management: Tools should leverage the same metadata, and the tools should provide a robust way to search, capture, store, reuse and publish metadata objects, such as dimensions, hierarchies, measures, performance metrics and report layout objects.
    Development tools: The platform should provide a set of programmatic and visual tools, coupled with a software developer's kit for creating analytic applications, integrating them into a business process, and/or embedding them in another application.
    Collaboration: Enables users to share and discuss information and analytic content, and/or to manage hierarchies and metrics via discussion threads, chat and annotations.
  • Information Delivery
    Reporting: Provides the ability to create formatted and interactive reports, with or without parameters, with highly scalable distribution and scheduling capabilities.
    Dashboards: Includes the ability to publish Web-based or mobile reports with intuitive interactive displays that indicate the state of a performance metric compared with a goal or target value. Increasingly, dashboards are used to disseminate real-time data from operational applications, or in conjunction with a complex-event processing engine.
    Ad hoc query: Enables users to ask their own questions of the data, without relying on IT to create a report. In particular, the tools must have a robust semantic layer to enable users to navigate available data sources.
    Microsoft Office integration: Sometimes, Microsoft Office (particularly Excel) acts as the reporting or analytics client. In these cases, it is vital that the tool provides integration with Microsoft Office, including support for document and presentation formats, formulas, data "refreshes" and pivot tables. Advanced integration includes cell locking and write-back.
    Search-based BI: Applies a search index to structured and unstructured data sources and maps them into a classification structure of dimensions and measures that users can easily navigate and explore using a search interface.
    Mobile BI: Enables organizations to deliver analytic content to mobile devices in a publishing and/or interactive mode, and takes advantage of the mobile client's location awareness.
  • Analysis
    Online analytical processing (OLAP): Enables users to analyze data with fast query and calculation performance, enabling a style of analysis known as "slicing and dicing." Users are able to navigate multidimensional drill paths. They also have the ability to write back values to a proprietary database for planning and "what if" modeling purposes. This capability could span a variety of data architectures (such as relational or multidimensional) and storage architectures (such as disk-based or in-memory).
    Interactive visualization: Gives users the ability to display numerous aspects of the data more efficiently by using interactive pictures and charts, instead of rows and columns.
    Predictive modeling and data mining: Enables organizations to classify categorical variables, and to estimate continuous variables using mathematical algorithms.
    Scorecards: These take the metrics displayed in a dashboard a step further by applying them to a strategy map that aligns key performance indicators (KPIs) with a strategic objective.
    Prescriptive modeling, simulation and optimization: Supports decision making by enabling organizations to select the correct value of a variable based on a set of constraints for deterministic processes, and by modeling outcomes for stochastic processes.
  • These capabilities enable organizations to build precise systems of classification and measurement to support decision making and improve performance. BI and analytic platforms enable companies to measure and improve the metrics that matter most to their businesses, such as sales, profits, costs, quality defects, safety incidents, customer satisfaction, on-time delivery and so on. BI and analytic platforms also enable organizations to classify the dimensions of their businesses — such as their customers, products and employees — with more granular precision. With these capabilities, marketers can better understand which customers are most likely to churn. HR managers can better understand which attributes to look for when recruiting top performers. Supply chain managers can better understand which inventory allocation levels will keep costs low without increasing out-of-stock incidents.
  • descriptive, diagnostic, predictive and prescriptive analytics
  • "descriptive"
  • diagnostic
  • data discovery vendors — such as QlikTech, Salient Management Company, Tableau Software and Tibco Spotfire — received more positive feedback than vendors offering OLAP cube and semantic-layer-based architectures.
  • Microsoft Excel users are often disaffected business BI users who are unable to conduct the analysis they want using enterprise, IT-centric tools. Since these users are the typical target users of data discovery tool vendors, Microsoft's aggressive plans to enhance Excel will likely pose an additional competitive threat beyond the mainstreaming and integration of data discovery features as part of the other leading, IT-centric enterprise platforms.
  • Building on the in-memory capabilities of PowerPivot in SQL Server 2012, Microsoft introduced a fully in-memory version of Microsoft Analysis Services cubes, based on the same data structure as PowerPivot, to address the needs of organizations that are turning to newer in-memory OLAP architectures over traditional, multidimensional OLAP architectures to support dynamic and interactive analysis of large datasets. Above-average performance ratings suggest that customers are happy with the in-memory improvements in SQL Server 2012 compared with SQL Server 2008 R2, which ranks below the survey average.
  •  
    "Gartner defines the business intelligence (BI) and analytics platform market as a software platform that delivers 15 capabilities across three categories: integration, information delivery and analysis."

2013 ERP research: Compelling advice for the CFO : Enterprise Irregulars

  • ERP vendor selection. As the following graph shows, the primary candidates for ERP software were SAP, Oracle, Microsoft, Epicor, and Infor:
  • The cloud question. Despite the hype, only 14 percent of respondents are using ERP delivered as Software as a Service (SaaS). Although the best cloud vendors can deliver better security and reliability than most internal IT departments, market momentum toward ERP in the cloud is not there yet, as the following diagram illustrates:
  • Important lessons. Implementing an ERP system is always complex because the deployment drives changes to both data and processes that extend across departmental boundaries inside the organization.
  • Software projects aren’t just technical endeavors. They’re also political, financial, emotional, structural, strategic, process and people-centric initiatives. Ignoring any one of these dimensions is done at the project manager’s peril.
  • Today’s CFO must balance the demands of two competing forces: the extraordinary wave of innovation (and the process changes these bring) against the regulatory, control-driven forces who want every process, every exception, and device to be documented, controlled and secured. In recent years, CFOs have spent tens of billions of dollars (or more) with audit firms to document the control points and risks within their existing ERP solutions.
  • ERP can bring significant benefit, but implementation requires careful attention to both business planning and technology activities. For this reason, achieving project success and business value demands that the CFO and CIO work together as a collaborative unit.
  • Therefore, it is essential to create this partnership and show your entire organization that the business and technology teams can communicate, collaborate, and share knowledge on a systematic and consistent basis. This collaboration is the true underlying strategy for gaining maximum value from ERP or any other enterprise initiative.

Rittman Mead Consulting - The Changing World of Business Intelligence

  • Schema on write
    This is the traditional approach for Business Intelligence. A model, often dimensional, is built as part of the design process. This model is an abstraction of the complexity of the underlying systems, put in business terms. The purpose of the model is to allow the business users to interrogate the data in a way they understand.
  • The model is instantiated through physical database tables, and the data is loaded through an ETL (extract, transform and load) process that takes data from one or more source systems, transforms it to fit the model, and then loads it into the model.
  • The key thing is that the model is determined before the data is finally written and the users are very much guided or driven by the model in how they query the data and what results they can get from the system. The designer must anticipate the queries and requests in advance of the user asking the questions.
  • Schema on read
    Schema on read works on a different principle and is more common in the Big Data world. The data is not transformed in any way when it is stored; the data store simply acts as a big bucket. The modelling of the data only occurs when the data is read. Map/Reduce is the clearest example: the mapping is the understanding of the data structure. Hadoop is a large distributed file system which is very good at storing large volumes of data, but that is only potential; it is the mapping of this data, done when the data is read rather than written, that provides the value.
  • New World Order
    So whereas Business Intelligence used to be driven by the model, the ETL process to populate the model and the reporting tool to query the model, there is now an approach where the data is collected in its raw form and advanced statistical or analytical tools are used to interrogate it. An example of one such tool is R.
  • The choice of approach is often driven by what the user wants to find out. If the question is clearly formed and the sources of data required to answer it are well understood, for example how many units of a product have we sold, then the traditional schema-on-write approach is best. (A small side-by-side sketch of the two approaches follows this list.)
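A minimal side-by-side sketch of the two approaches, assuming hypothetical table and file names (the Oracle-style INSERT illustrates schema on write; the HiveQL external table illustrates schema on read):

```sql
-- Schema on write (illustrative): the dimensional model exists before the
-- load, and the ETL step transforms source rows to fit it.
INSERT INTO sales_fact (date_key, product_key, amount)
SELECT d.date_key, p.product_key, s.amount
FROM   staging_sales s
JOIN   date_dim    d ON d.calendar_date = s.sale_date
JOIN   product_dim p ON p.product_code  = s.product_code;

-- Schema on read (illustrative, HiveQL): the raw files already sit in HDFS;
-- this definition only describes how to interpret them at query time.
CREATE EXTERNAL TABLE raw_sales (
  sale_date    STRING,
  product_code STRING,
  amount       DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/sales';
```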

Google Reader (250)

  • What this means in practice is that when the BI Server component starts up, it creates and reserves a number of threads in advance, determined by a number of parameters including SERVER_THREAD_RANGE.
  • You can see these threads running and ready to perform tasks for the BI Server component by using a tool such as Process Explorer for Windows
  • Thinking it through a bit, any given single query is, to a certain extent, only really going to use a small part of the total amount of CPUs available on a server, because it’s not the BI Server that runs queries in parallel, it’s the underlying database. For example, a single analysis against a single Oracle Database datasource would only really need a single BI Server thread to handle the query request, but when the underlying database receives the query, it might use a large number of its CPUs to process the query, returning results back to the BI Server to then pass back to the Presentation Server for display to the user.
  • The BI Server wouldn't have any use for any more query threads, as it can't really do anything with them (the exception being queries that generate multiple physical SQLs, for example to join data from multiple sources and return a single set of data to the user, where the BI Server could benefit from a higher CPU count if each of those queries in turn led to lots of threads being used). But two queries, in themselves, don't necessarily require two CPUs, because the BI Server, and the underlying CPUs, are themselves multi-threaded.
  • To conclude, then: all things being equal, the BI Server should make use of all of the CPUs that the underlying operating system presents to it, with the OS itself deciding which threads are scheduled against which CPUs. In theory, all CPUs on the server are available to each BI Server component, but each OS is different and it might be worth experimenting if you're sure certain CPUs aren't being used; this is most probably unlikely, though, and the main reasons to consider vertical scale-out of BI Server components are fault tolerance, or a 32-bit OS where each process can only see a subset of the total memory. And bear in mind that, however many CPUs the BI Server has available to it, for queries that send just a single SQL statement down to the underlying database server, adding more or faster CPUs isn't going to help: only a single thread (or so) is needed to send the query from the BI Server to the database, and it's the database that's doing all of the work. Extra BI Server CPU only helps with compilation and post-aggregation work, and with handling a higher number of concurrent users. Invest in a better underlying database instead, sort out your data model, and make sure your data source back-end is as optimised as possible.
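As an aside (an assumption, not something the post shows): you can watch the database-side fan-out the excerpt describes, where one physical SQL sent over a single BI Server connection is serviced by many parallel execution slaves on the Oracle database.

```sql
-- Illustrative only: count parallel execution (PX) slave sessions per query
-- coordinator session (QCSID); one OBIEE connection can map to many slaves.
SELECT qcsid,
       COUNT(*) AS px_slave_sessions
FROM   v$px_session
WHERE  sid <> qcsid
GROUP  BY qcsid;
```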

Difference between CRM lead and an opportunity - Pipeliner CRM Blog

  • Any individual fish or pod of fish in your sea represents one lead.
  • Your Nemo will not be the first or the second fish that you catch. At the beginning, you will have very little information about the Nemo you would like to catch. You will start to examine your fish and create some criteria for what Nemo should look like. In other words, you are qualifying your fish.
  • Lead = Any Fish in The Sea. Opportunity = Nemo
  • The process of examination and adding the criteria represents your sales pipeline strategy. It’s always true that: “Without a commitment to pursue working together (something that results in this company potentially buying from you) there is no opportunity.” - Anthony Iannarino
  • At the end of your examination, i.e. of your sales process, you will either let the fish swim back into your sea (lost opportunity) or you will put Nemo into your aquarium (won opportunity). Won Opportunity = You have found Nemo. Lost Opportunity = You have not found Nemo.
  • A Lead is a contact or an account with very little information. It could be just a person you might have met at a conference. You will need to gather more information about this lead in order to create (qualify) an opportunity in your sales pipeline.
  • An old sales rule says: "If you have never contacted your contact, it's a lead."
  • An Opportunity is a contact or an account which has been qualified. This person has entered your buying cycle and is committed to working with you. You have already contacted, called or met them and know their needs or requirements. The old sales rule says: "The opportunity is a deal that you have the possibility to close!"
  • “Think about the difference between a lead and an opportunity as an evolving process i.e. each lead needs to be qualified to an opportunity. There will always be plenty of leads in your sales territory, but only few of them will qualify to become real sales opportunity.”

Oracle BI Blog - EPM, Business Intelligence, and OBIEE: OBIEE 11g, Setup Client DSN for...

  •  
    "After installing the OBI 11g client tools each OBI developer or administrator will need to access the Oracle BI RPD using the OBI Administration Tool. The Administration Tool is the GUI that connects to the Oracle BI Server RPD in Online mode (or on the network in offline mode) allowing development and administration functionality of the RPD. The informal video below highlights the process in which to create an ODBC data source connection to the Oracle BI server and test that the connectivity is working."

BI Brief - Four Legs of a Successful Business Intelligence (BI) Project Team

  • 1. Project Sponsorship and Governance
    2. Project Management
    3. Development Team (Core Team)
    4. Extended Project Team
  • 1. Project Sponsorship and Governance
    IT and the business should form a BI steering committee to sponsor and govern design, development, deployment, and ongoing support. It needs both the CIO and a business executive, such as the CFO, COO, or a senior VP of marketing/sales, to commit budget, time, and resources. The business sponsor needs the project to succeed. The CIO is committed to what is being built and how.
  • 2. Project Management
    Project management includes managing daily tasks, reporting status, and communicating to the extended project team, steering committee, and affected business users. The project management team needs extensive business knowledge, BI expertise, DW architecture background, and people management, project management, and communications skills. The team includes three functions or members:
    Project development manager - Responsible for deliverables, managing team resources, monitoring tasks, reporting status, and communications. Requires a hands-on IT manager with a background in iterative development, who understands the changes caused by this approach and the impact on the business, project resources, schedule and the trade-offs.
    Business advisor - Works within the sponsoring business organization. Responsible for the deliverables of the business resources on the project's extended team. Serves as the business advocate on the project team and the project advocate within the business community. Often, the business advocate is a project co-manager who defers the daily IT tasks to the IT project manager but oversees the budget and business deliverables.
    BI/DW project advisor - Has enough expertise with architectures and technologies to guide the project team on their use. Ensures that architecture, data models, databases, ETL code, and BI tools are all being used effectively and conform to best practices and standards.
  • 3. Development Team (Core Team)
    The core project team is divided into four sub-teams:
    Business requirements - This sub-team may have business people who understand IT systems, or IT people who understand the business. In either case, the team represents the business and its interests. They are responsible for gathering and prioritizing business needs; translating them into IT systems requirements; interacting with the business on data quality and completeness; and ensuring the business provides feedback on how well the solutions generated meet their needs.
    BI architecture - Develops the overall BI architecture, selects the appropriate technology, creates the data models, maps the overall data workflow from source systems to BI analytics, and oversees the ETL and BI development teams from a technical perspective.
    ETL development - Receives the business and data requirements, as well as the target data models to be used by BI analytics, and develops the ETL code needed to gather data from the appropriate source systems into the BI databases. Often a systems analyst who is an expert in the source systems, such as SAP, is part of the team to provide knowledge of the data sources, customizations, and data quality.
    BI development - Creates the reports or analytics that the business users will interact with to do their jobs. This is often a very iterative process and requires much interaction with the business users.
  • 4. Extended Project Team
    Several functions required by the project team are often accomplished through an "extended" team:
    Players - A group of business users sign up to "play with" or test the BI analytics and reports as they are developed, providing feedback to the core development team. This is a virtual team that comes together at specific periods of the project, but its members are committed to this role during those periods.
    Testers - A group of resources gathered, similarly to the virtual team above, to perform more extensive QA testing of the BI analytics, ETL processes, and overall systems testing. You may have project members test other members' work, such as the ETL team testing the BI analytics and vice versa.
    Operators - IT operations is often separated from the development team, but it is critical that they are involved from the beginning of the project to ensure that the systems are developed and deployed within your company's infrastructure. Key functions are database administration, systems administration, and networks. In addition, this extended team may also include help desk and training resources if they are usually provided outside of development.

SugarCRM - Install Settings

  • memory_limit - Recommended setting: 512M or higher. The memory_limit parameter mainly comes into play when executing large transactions such as mass update, export and import. If this setting is too low when trying to perform one of these actions, the end user will encounter a fatal error and the process will not complete.
    upload_max_filesize and post_max_size - Recommended setting: 30M or higher. Both of these settings work in conjunction with each other when uploading files through SugarCRM, which includes future upgrades as well as document and note attachments. Please note that there is also a setting in the application which can limit upload file size for end users, so the settings in PHP should be high enough to allow any future upgrade files to be loaded without error.
    max_execution_time - Recommended setting: 300. This setting controls how long a PHP process will remain active. It is important to set this parameter to a value that will allow large requests to complete if necessary, but will not hamper the performance of the server if a process runs too long.
  • With regard to PHP setup, the following parameters should be set with the values indicated:
  • Maximum upload size (Admin > System Settings) - Recommended setting: 30000000 (~30 MB). The 'Maximum upload size' controls the maximum file size your users can upload into Sugar. This setting should not exceed the post_max_size and upload_max_filesize parameters in your PHP configuration.
  •  
    "memory_limit"

Magic Quadrant for Sales Force Automation

  •  
    "Market Definition/Description Sales force automation (SFA) applications support the automation of sales activities, processes and administrative responsibilities for B2B organizations' sales professionals. Core functionalities include account, contact and opportunity management. Additional add-on capabilities focus on improving the sales effectiveness of salespeople, such as sales configuration, guided selling, proposal generation and content management, and sales performance management support, including incentive compensation, quota, sales coaching and territory management."

Process Type Plugin - SaveToDisk

  • Save file to file system rather than into table

Why BI projects fail -- and how to succeed instead | InfoWorld

  • A successful initiative starts with a good strategy, and a good strategy starts with identifying the business need.
  • The balanced scorecard is one popular methodology for linking strategy, technology, and performance management. Other methodologies, such as applied information economics, combine statistical analysis, portfolio theory, and decision science in order to help firms calculate the economic value of better information. Whether you use a published methodology or develop your own approach in-house, the important point is to make sure your BI activities are keyed to generating real business value, not merely creating pretty, but useless, dashboards and reports.
  • Next, ask: What data do we wish we had and how would that lead to different decisions? The answers to these questions form top-level requirements for any BI project.
  • Instead a team of data experts, data analysts, and business experts must come together with the right technical expertise. This usually means bringing in outside help, though that help needs to be able to talk to management and talk tech.
  • Nothing makes an IT department more nervous than asking for a feed to a key operational system. Moreover, a lot of BI tools are resource hungry. Your requirements should dictate what, how much, and how often (that is, how “real time” you need it to be) data must be fed into your data warehousing technology.
  • In other words, you need one big feed to serve all instead of hundreds of operational, system-killing little feeds that can’t be controlled easily.
  • You'll probably need more than one tool to suit all of your use cases.
  • You did your homework, identified the use cases, picked a good team, started a data integration project, and chose the right tools.
  • Now comes the hard part: changing your business and your decisions based on the data and the reports. Managers, like other human beings, resist change.
  • Moreover, BI projects shouldn't have a fixed beginning and end -- this isn't a sprint to become "data driven."
  • A process is needed
  • and find new opportunities in the data.
  • Here's the bottom line, in a handy do's-and-don'ts format:
    Don't simply run a tool-choice project
    Do cherry-pick the right team
    Do integrate the data so that it can be queried performance-wise without bringing down the house
    Don't merely pick a tool -- pick the right tools for all your requirements and use cases
    Do let the data change your decision making and the structure of your organization itself if necessary
    Do have a process to weed out useless analytics and find new ones

BI is becoming democratized at the operational level (BI-ul se democratizează la nivel operaţional)

  • In practice, we are seeing traditional BI come down from its abstract spheres toward "enterprise intelligence"; a form of BI "democratization" has become accessible to the mass of end users, based on the idea that the tools specific to this concept (analysis, reporting, alerting, etc.) must enable and support real-time decision making.
  • According to specialists, this "mutation" is a natural evolution, driven by market pressure, which demands ever faster and increasingly complex decisions at the level of day-to-day operational management. It is, in essence, a reorientation of the BI concept from the traditional "data-centric" approach toward the more pragmatic "process-centric" one, intended to allow a more agile response to market challenges.
  • Thus, operational BI applications are no longer reserved for business analysts in top management; they are also accessible to executives, managers and end users with decision-making power. Through this new concept, sales department managers and support-center staff get information tied to their daily activity lists, along with workflows and analysis guides that help them interpret and analyze the information on which their decisions must be based.
  • According to the final results, 66% of the respondents to the Ventana study indicated that the most important business-wide gain from implementing operational BI solutions in their companies was increased efficiency at all levels. (In addition, 60% of the subjects indicated that improving the services offered to customers is the main priority pursued through the development of operational BI applications.) Another 53% consider major cost reductions the main benefit, while 48% credit the operational approach to BI as the main differentiator against the competition.
  • The differentiating factors
    The results highlighted by the Ventana study sound more than promising and confirm analysts' optimistic forecasts for market growth in this area, which is expected to evolve even faster than the market for traditional BI applications. To make the distinction clearer, here are the essential points on which the operational approach differs from the traditional one: audience, granularity, response time and availability. Briefly, each parameter explained:
    Audience: the user base of operational BI deployments includes employees involved in operational activities (sales agents, technical staff, contact-center personnel, etc.) who must quickly make decisions with significant impact at that level, as well as managers who must routinely track operational performance indicators at certain levels. Where a company that has implemented operational BI has managed to establish a clear link between strategic Key Performance Indicators and operational metrics, the audience also includes senior management, who can investigate in depth how well the established strategic directions are being followed. The conclusion: the audience for operational BI applications is much larger than for traditional BI.
  • Response time: the response interval for operational BI applications is significantly shorter than in traditional BI. Most operational modules need data whose 'freshness' may vary from a few seconds to a few minutes. This imposes special conditions on delivering data in real time, because the data is needed for decisions in operational processes with short reaction times.
    Granularity: unlike traditional BI solutions, which aggregate data to provide an overall view of the company's performance, operational BI applications need a much finer level of data granularity to address specific needs at the operational level. (This does not hold for all 'operational BI' applications, however; some cases require aggregated data coming from the data warehouse. The most common example is the 'customer lifetime value' parameter used by contact-center agents.)
    Availability: operational BI applications are meant to directly support transactional business or support processes, which means that downtime in these applications directly affects the company's ability to close transactions and offer support to customers. The logical consequence is that these applications must be highly resilient.

What Skills Does an Oracle BI Developer Need in 2011?

  • OBIEE 11g skills, both in terms of new functionality (mapping, analyses, KPIs and Scorecards etc) and new infrastructure (WebLogic, EM, OPSS etc)
    A smattering of Essbase skills, focused mainly on the integration with OBIEE and Essbase (and the many workarounds and gotchas)
    Good ODI skills, both in terms of the basics, but also being able to write knowledge modules, integrate with OBIEE, deployment and migration
    Solid database skills - OBIEE gave the illusion through aggregates etc that database tuning was redundant, but time has shown it's by far the biggest success factor in a project - get the database design and optimisation wrong, and your project is toast. You need to know partitioning, materialized views, index types, and increasingly, you need to get yourself on an Exadata project as customers are buying the technology but you can't teach it to yourself at home (a small sketch of partitioning and materialized view DDL appears after this list)
    BI Apps skills, but watch out for everything changing when BI Apps 11g comes out, and be prepared to learn the Fusion Apps and JDeveloper if you want to stay in the game
    Looking to the future, keep an eye on technologies such as in-memory (TimesTen), mid-tier caching (Coherence), plus technologies such as Business Activity Monitoring (BAM), "big data" (Hadoop, large data sets, NoSQL), complex event processing and maybe products such as Qlikview, just in case Oracle buys them, or at least to know what the competition are up to, or more importantly pitching to your boss
  • The other thing to bear in mind of course, if you’re an Oracle BI developer, is that you need to have great business, communication and data modeling skills.
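As a pointer to the database features named in the skills list above, a minimal sketch (hypothetical SALES table and column names, not from the article) of a range-partitioned fact table and a materialized view that can answer summary queries via query rewrite:

```sql
-- Range-partitioned fact table: each month's data lands in its own partition,
-- so scans and maintenance can be limited to the partitions of interest.
CREATE TABLE sales (
  sale_date   DATE,
  product_id  NUMBER,
  amount_sold NUMBER
)
PARTITION BY RANGE (sale_date) (
  PARTITION p2013_01 VALUES LESS THAN (DATE '2013-02-01'),
  PARTITION p2013_02 VALUES LESS THAN (DATE '2013-03-01'),
  PARTITION p_max    VALUES LESS THAN (MAXVALUE)
);

-- Pre-aggregated materialized view; with query rewrite enabled, summary
-- queries against SALES can be answered from this much smaller object.
CREATE MATERIALIZED VIEW sales_by_product_mv
ENABLE QUERY REWRITE
AS
SELECT product_id, SUM(amount_sold) AS total_sold
FROM   sales
GROUP  BY product_id;
```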

Filling a Critical Role in Business Today: The Data Translator - Microsoft Business Int...

  • a lot of articles calling data scientists and statisticians the jobs of the future
  • there are more immediate needs that, when addressed, will have a much greater business impact.
  • Right now we have huge opportunities to make the data more accessible, more “joinable” and more consumable. Leaders don’t want more data – they want more information they can use to run their businesses.
  • Every company has hundreds of millions of records about their sales, expenses, employees and so on, with dozens of insights yet to be discovered through simple comparison or triangulation of relevant data.
  • Why don't we focus on this? I think because it's very difficult to do - being successful in this "data translator" role requires a unique set of skills and knowledge, the combination of which I call the BASE skillset:
    Business understanding
    Ability to synthesize and simplify
    Storytelling skills
    Expertise in data visualization
  • Business Understanding
    This one seems obvious, but it doesn't mean simply understanding the financials of a business. Rather, it means truly knowing the operational details, the incentives, the install base, market growth, penetration, the competition, etc. An analyst can't just know the technical aspect of a report or the math behind the numbers, but what is truly driving a pattern in terms of product quality, competition, incentives and/or offerings. The best analysts are able to mathematically isolate the key levers of a trend and then suggest actions to react to or take advantage of those trends.
    Ability to Synthesize and Simplify
    This is, in my opinion, the most underrated and underappreciated skill. Combing through thousands of data points and netting out 3-4 key issues in under 10 minutes, and then communicating these to a group of execs with very different analytical skills, is truly difficult. The key is to make it simple but not simplistic, which means you still capture the complexity even as you get to the few core insights. It requires a very thorough effort to gather all the relevant information before categorizing, prioritizing and deciding if it is significant. After a while, you become an expert and can sniff things out quickly. At the same time, there is the danger of missing anomalies when you jump to conclusions based only on a summary look.
  • Storytelling Skills
    There are stages that should be followed when explaining complex ideas, something data translators are frequently expected to do. The best storytellers start by giving context and trying to couple the current discussion to something the audience already knows, ensuring the story is well structured and connected. We have to move from a "buffet style" business review with thousands of numbers packed in tables to a layered approach that will guide the audience to focus first on the most relevant messages, diving deeper only when necessary. Minto Pyramid Principles, which are built around a process for organizing thought and communication, are helpful in making sure you really focus on what is important and relevant, versus being obsessed with telling every fact.
    Expertise in Data Visualization
    I am glad to finally see so much focus on information visualization, and I believe this is correlated to the explosion of data. Traditional methods of organizing data do not facilitate an intuitive understanding of key information points or trends. For instance, the two examples below contain data on car sales across the U.S. The first, an alphabetized list, is much less intuitive than the second, which shows those sales on a map in Power View. With Power View, right away you can identify the states with the highest sales: CA, FL, TX, NY. (Workbook available here)
  • There is no better way to see patterns or trends than data visualization, making expertise in this area – both technical and analytical – critical for data translators.

Hadoop Tutorial - YDN

  • Hadoop is designed to efficiently process large volumes of information by connecting many commodity computers together to work in parallel. The theoretical 1000-CPU machine described earlier would cost a very large amount of money, far more than 1,000 single-CPU or 250 quad-core machines. Hadoop will tie these smaller and more reasonably priced machines together into a single cost-effective compute cluster.

Dancing and Wrestling with Oracle APEX: Apex and FusionCharts (or There be dragons at t...

  • All of which led me to FusionCharts, which is a brilliant set of flash charts and widgets.
  • All I had to do was figure out how to integrate it into my app. First I had to write a function to extract the data I needed from my database and output it as correctly-formatted XML. That bit was easy so I won't bore you with it.
  • Next I uploaded the Flash (SWF) file for my chart into my workspace. (Tell me something: when you upload an image to your application using Apex's image uploader, you refer to it by pointing at #APP_IMAGES#, so how do you think you'd refer to a file you've uploaded using Apex's file uploader? #APP_FILES#? Wrong! Illogically, all files uploaded into your application should be pointed at using the #APP_IMAGES# substitution string.) Finally, I created a dynamic PL/SQL content region outputting the necessary wrapper tags for my Flash movie (which I copied from the FusionCharts examples), pointing it at my uploaded SWF file and feeding it the XML from my database function (which I call in a "before regions" page process).
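The post deliberately skips the XML-generating function. Purely as a hedged sketch of what such a function might look like, assuming a hypothetical SALES_BY_REGION table and element names that should be adjusted to whatever the installed FusionCharts version expects:

```sql
CREATE OR REPLACE FUNCTION sales_chart_xml RETURN CLOB IS
  l_xml CLOB;
BEGIN
  -- Build a single-series chart document; <chart> and <set> are placeholders
  -- for the element names the FusionCharts SWF in use actually requires.
  l_xml := '<chart caption="Sales by Region">';
  FOR r IN (SELECT region, SUM(amount) AS amt
            FROM   sales_by_region
            GROUP  BY region
            ORDER  BY region) LOOP
    l_xml := l_xml || '<set label="' || r.region ||
             '" value="' || TO_CHAR(r.amt) || '"/>';
  END LOOP;
  RETURN l_xml || '</chart>';
END sales_chart_xml;
/
```

In the dynamic PL/SQL content region, the output of a function like this would then be printed (for example via htp.p) inside the Flash object/embed wrapper tags the post mentions.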