
BI-TAGS: Group items tagged productivity


Gartner Positions Oracle in Leaders Quadrant for Master Data Management of Product Data...

  • For the fourth consecutive year, Gartner, Inc. has named Oracle as a Leader in its “Magic Quadrant for Master Data Management of Product Data Solutions.” (1)
  • “MDM is a technology-enabled discipline in which business and IT staff work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official, shared master data assets. Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise, such as customers, prospects, citizens, suppliers, sites, hierarchies and chart of accounts,” according to Gartner.
  • By enabling organizations to consolidate product information from heterogeneous systems, Oracle Product Hub creates a single view of product information that can be leveraged and shared across functional departments in the enterprise, as well as externally with trading partners.
  • "In any product company, accurate product information is a foundation for all major business initiatives, and this requires a robust, comprehensive and flexible product MDM solution,” said Jon Chorley, vice president, supply chain management product strategy, Oracle. "We believe Oracle's position in Gartner's Magic Quadrant for Master Data Management of Product Data Solutions highlights our ability to provide best-in-class functionality across the industry’s most complete MDM portfolio. By using Oracle MDM solutions, companies can obtain a high-quality, common enterprise product record and are better able to support their key business initiatives.”
  • "Gartner Positions Oracle in Leaders Quadrant for Master Data Management of Product Data Solutions"

Big data: The next frontier for innovation, competition, and productivity | McKinsey & ...

  • The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus, according to research by MGI and McKinsey's Business Technology Office.
  • important factor of production, alongside labor and capital.
  • five broad ways in which using big data can create value
  • Leading companies are using data collection and analysis to conduct controlled experiments to make better management decisions
  • others are using data for everything from basic low-frequency forecasting to high-frequency nowcasting, adjusting their business levers just in time.
  • big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.
  • Fourth, sophisticated analytics can substantially improve decision-making
  • big data can be used to improve the development of the next generation of products and services.
  • The use of big data will become a key basis of competition and growth for individual firms.
  • For example, we estimate that a retailer using big data to the full has the potential to increase its operating margin by more than 60 percent.
  • The computer and electronic products and information sectors, as well as finance and insurance, and government are poised to gain substantially from the use of big data.

Downloads

  • "Downloads: These BIRT download options are for designing, deploying and viewing BIRT output. They include open source products licensed under the Eclipse Public License and 45-day trial versions of Actuate commercial products."

Analyzing Human Data: Take a Dive to Find Out What Your Customers Really Feel - Content...

  • What really interests me, and what I think should interest marketers, is what I’ll call signals – one of which is intent. Intent is critical because it can predict action. For example, “Is this person shopping to buy a product like my product?” “Is this person unhappy and needing some form of attention?” “Is this person about to return the product for a reason that is addressable?”
  • Sentiment is one ingredient of intent. If someone is happy, sad, angry … that can be determined via sentiment analysis technologies.
  • Many tools struggle with context.
  • An example I hear over and over again is “thin” – good when you’re talking about electronics, but bad if you’re talking about hotel walls or the feel of hotel sheets. To do sentiment analysis correctly, you need refinement. You need customization for particular industries and business functions.
  • The market, unfortunately, is polluted with tools that claim to have sentiment abilities but are too crude to be usable. Even with refinement (e.g., the ability to handle negators and contextual sentiment), approaches that deliver only positive and negative ratings don’t take you very far (a toy illustration of negator and domain handling follows this list).
  • There are definitely easy, inexpensive entry points that can meet basic, just-getting-started needs: tools for social listening, survey analysis, customer service (handling contact-center notes, for instance), customer experience (via analysis of online reviews and forums), automated email processing, and other needs. These technologies are user friendly, available on demand, as a service.
  • Text mining: Digital Reasoning, Luminoso and AlchemyAPI.
  • Image recognition and analysis: Image analysis now automatically identifies brand labels in pictures. Vendors include VisualGraph (now owned by Pinterest), Curalate, Piqora (nee Pinfluencer), and gazeMetrix.
  • Emotional analysis in images, audio, and video: These companies promote analysis of speech and facial expression, primarily for structured studies:
    Affectiva conducts webcam emotional analysis for media and ad research, including development tools to integrate emotional study in mobile apps.
    Emotient performs emotional analyses in retail environments, evaluating signage, displays, and customer service.
    EmoVu by Eyeris tests the engagement level of both short- and long-form video content.
    Beyond Verbal studies emotion based on a person's voice in real time.
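The refinement issues highlighted above (negators and context-dependent polarity such as "thin") are easy to see in a toy example. The following is a minimal, hypothetical lexicon-based scorer in Python, not any vendor's algorithm: it flips polarity after a negator and swaps in a per-domain lexicon, which is why "thin" can score positive for electronics and negative for hotels. The word lists and weights are invented for illustration.

    # Minimal, hypothetical lexicon-based sentiment scorer.
    # Shows negator handling and per-domain lexicons only; real tools
    # use far richer linguistic models.
    NEGATORS = {"not", "no", "never", "hardly"}

    # Domain-specific polarity lexicons (invented weights for illustration).
    LEXICONS = {
        "electronics": {"thin": 1, "light": 1, "slow": -1, "broken": -2},
        "hotels":      {"thin": -1, "clean": 2, "noisy": -2, "comfortable": 2},
    }

    def score(text: str, domain: str) -> int:
        lexicon = LEXICONS[domain]
        words = text.lower().split()
        total = 0
        for i, word in enumerate(words):
            polarity = lexicon.get(word.strip(".,!?"), 0)
            # Flip polarity when the previous word is a negator ("not broken").
            if i > 0 and words[i - 1] in NEGATORS:
                polarity = -polarity
            total += polarity
        return total

    if __name__ == "__main__":
        print(score("The new tablet is thin and light", "electronics"))     #  2
        print(score("The walls are thin and the room is noisy", "hotels"))  # -3
        print(score("The screen is not broken", "electronics"))             #  2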

PL/PDF generate and manipulate PDF with Oracle PL/SQL

  • "Oracle Reporting & Document Generation: PL/PDF is simply the easiest and most flexible way to create professional reports from your Oracle database. Data access is fast and safe because our products work inside the database, so there is no need for extra servers and extra costs. We provide native PL/SQL solutions, which is the best way to work with Oracle data. All Oracle developers already know and use PL/SQL, so there is no need to learn a new programming language."

Enkitec : Products : Plug-ins : Plug-in Details

  • Plug-in Details: CLOB Load. Have you ever used a Rich Text Editor item type in APEX? Has the amount of text ever exceeded 32k? Choke! And so begins the search for a workaround... While workarounds exist, they can be a little difficult to find and to repeat each time you need them. We designed the Enkitec CLOB Load plug-in to simplify and standardize how large amounts of text are moved from the database to APEX page items and then back to the database.
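The 32k ceiling described above is an instance of a general pattern: text larger than an item limit has to be split into chunks, moved, and reassembled on the other side. The plug-in's own implementation is not shown here; the short Python sketch below only illustrates that generic chunk-and-reassemble idea, with a 32,000-character limit assumed for the example.

    # Generic chunk-and-reassemble sketch for text larger than an item limit.
    # Hypothetical illustration only, not the plug-in's code.
    CHUNK_SIZE = 32_000  # assumed per-item limit

    def split_text(text: str, size: int = CHUNK_SIZE) -> list[str]:
        """Break a large string into transport-sized chunks."""
        return [text[i:i + size] for i in range(0, len(text), size)]

    def join_text(chunks: list[str]) -> str:
        """Reassemble the chunks, in order, on the receiving side."""
        return "".join(chunks)

    if __name__ == "__main__":
        document = "x" * 100_000
        chunks = split_text(document)
        assert len(chunks) == 4              # three full chunks plus a remainder
        assert join_text(chunks) == document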

The New Productivity Platforms: Your Solution To The AD&D Crunch | Forrester Blogs

  • Most application development and delivery teams have simple marching orders: "Do more with less — and fast. And when you've done more with less, figure out how to do even more with still less on your next set of projects. And deliver even faster."
  • application development and delivery (AD&D)

Rittman Mead Consulting - The Changing World of Business Intelligence

  • Schema on write: This is the traditional approach for Business Intelligence. A model, often dimensional, is built as part of the design process. This model is an abstraction of the complexity of the underlying systems, put in business terms. The purpose of the model is to allow the business users to interrogate the data in a way they understand.
  • The model is instantiated through physical database tables, and the data is loaded through an ETL (extract, transform and load) process that takes data from one or more source systems, transforms it to fit the model, then loads it into the model.
  • The key thing is that the model is determined before the data is finally written and the users are very much guided or driven by the model in how they query the data and what results they can get from the system. The designer must anticipate the queries and requests in advance of the user asking the questions.
  • Schema on read: Schema on read works on a different principle and is more common in the Big Data world. The data is not transformed in any way when it is stored; the data store acts as a big bucket. The modelling of the data only occurs when the data is read. Map/Reduce is the clearest example: the mapping is the understanding of the data structure. Hadoop is a large distributed file system that is very good at storing large volumes of data, but this is only potential. It is the mapping of this data that provides value, and this is done when the data is read, not written.
  • New World Order: So whereas Business Intelligence used to always be driven by the model, the ETL process to populate the model and the reporting tool to query the model, there is now an approach where the data is collected in its raw form, and advanced statistical or analytical tools are used to interrogate the data. An example of one such tool is R.
  • The choice of approach is often driven by what the user wants to find out. If the question is clearly formed and the sources of data required to answer it are well understood (for example, how many units of a product have we sold), then the traditional schema on write approach is best. The sketch below contrasts the two approaches.
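One compact way to see the contrast: with schema on write, structure is imposed before storage, so later questions are limited to what the model anticipated; with schema on read, raw records are kept as-is and structure is applied only at query time. The Python sketch below uses in-memory lists and dicts purely as stand-ins for a modelled warehouse table and a Hadoop-style "big bucket"; the sample records are invented.

    # Schema on write vs schema on read, using in-memory stand-ins.
    raw_events = [
        {"type": "sale", "product": "widget", "qty": 3, "price": 9.99},
        {"type": "sale", "product": "gadget", "qty": 1, "price": 24.50},
        {"type": "click", "page": "/home", "user": "u42"},  # does not fit the sales model
    ]

    # Schema on write: transform into a fixed model before storing.
    # Anything that does not fit the model is dropped (or rejected) at load time.
    sales_table = [
        {"product": e["product"], "revenue": round(e["qty"] * e["price"], 2)}
        for e in raw_events
        if e["type"] == "sale"
    ]

    def revenue_by_product_on_write():
        totals = {}
        for row in sales_table:  # query the pre-shaped model
            totals[row["product"]] = totals.get(row["product"], 0) + row["revenue"]
        return totals

    # Schema on read: store everything raw, impose structure only when queried.
    def revenue_by_product_on_read():
        totals = {}
        for e in raw_events:  # interpret the structure at read time
            if e.get("type") == "sale":
                revenue = round(e["qty"] * e["price"], 2)
                totals[e["product"]] = totals.get(e["product"], 0) + revenue
        return totals

    if __name__ == "__main__":
        print(revenue_by_product_on_write())  # {'widget': 29.97, 'gadget': 24.5}
        print(revenue_by_product_on_read())   # same answer, but the click event is
                                              # still there for questions asked later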

What Skills Does an Oracle BI Developer Need in 2011?

  • OBIEE 11g skills, both in terms of new functionality (mapping, analyses, KPIs and Scorecards etc) and new infrastructure (WebLogic, EM, OPSS etc)
  • A smattering of Essbase skills, focused mainly on the integration with OBIEE and Essbase (and the many workarounds and gotchas)
  • Good ODI skills, both in terms of the basics, but also being able to write knowledge modules, integrate with OBIEE, deployment and migration
  • Solid database skills – OBIEE gave the illusion through aggregates etc that database tuning was redundant, but time has shown it’s by far the biggest success factor in a project – get the database design and optimisation wrong, and your project is toast. You need to know partitioning, materialized views, index types, and increasingly, you need to get yourself on an Exadata project as customers are buying the technology but you can’t teach it to yourself at home
  • BI Apps skills, but watch out for everything changing when BI Apps 11g comes out, and be prepared to learn the Fusion Apps and JDeveloper if you want to stay in the game
  • Looking to the future, keep an eye on technologies such as in-memory (TimesTen), mid-tier caching (Coherence), plus technologies such as Business Activity Monitoring (BAM), “big data” (Hadoop, large data sets, NoSQL), complex event processing and maybe products such as Qlikview, just in case Oracle buys them, or at least to know what the competition are up to, or more importantly pitching to your boss
  • The other thing to bear in mind of course, if you’re an Oracle BI developer, is that you need to have great business, communication and data modeling skills.

Visual Business Intelligence - Naked Statistics

  • You can’t learn data visualization by memorizing a set of rules. You must understand why things work the way they do.
  • you must be able to think statistically
  • This doesn’t mean that you must learn advanced mathematics, nor can you do this work merely by learning how to use software to calculate correlation coefficients and p-values.
  • I am happy to announce that I’ve just found the book that does this better than any other that I’ve seen: Naked Statistics: Stripping the Dread from the Data, by Charles Wheelan (W. W. Norton & Company, 2013).
  • Wheelan teaches public policy and economics at Dartmouth College and is best known for a similar book written several years ago titled Naked Economics.
  • In Naked Statistics, he selects the most important and relevant statistical concepts that everyone should understand, especially those who work with data, and explains them in clear, entertaining, and practical terms.
  • He wrote this book specifically to help people think statistically. He shows how statistics can be used to improve our understanding of the world. He demonstrates that statistical concepts are easy to understand when they’re explained well.
  • If you read this book, you’ll come to understand statistical concepts and methods such as regression analysis and probability as never before.
  • Statistics is more important than ever before because we have more meaningful opportunities to make use of data. Yet the formulas will not tell us which uses of data are appropriate and which are not. Math cannot supplant judgment.
  • “Go forth and use data wisely and well!”

Filling a Critical Role in Business Today: The Data Translator - Microsoft Business Int...

  • a lot of articles calling data scientists and statisticians the jobs of the future
  • there are more immediate needs that, when addressed, will have a much greater business impact.
  • Right now we have huge opportunities to make the data more accessible, more “joinable” and more consumable. Leaders don’t want more data – they want more information they can use to run their businesses.
  • Every company has hundreds of millions of records about their sales, expenses, employees and so on, with dozens of insights yet to be discovered through simple comparison or triangulation of relevant data.
  • Why don’t we focus on this? I think because it’s very difficult to do – being successful in this “data translator” role requires a unique set of skills and knowledge, the combination of which I call the BASE skillset: Business understanding, the Ability to synthesize and simplify, Storytelling skills, and Expertise in data visualization.
  • Business Understanding: This one seems obvious, but it doesn’t mean simply understanding the financials of a business. Rather, it means truly knowing the operational details, the incentives, the install base, market growth, penetration, the competition, etc. An analyst can’t just know the technical aspect of a report or the math behind the numbers, but what is truly driving a pattern in terms of product quality, competition, incentives and/or offerings. The best analysts are able to mathematically isolate the key levers of a trend and then suggest actions to react to or take advantage of those trends.
  • Ability to Synthesize and Simplify: This is, in my opinion, the most underrated and underappreciated skill. Combing through thousands of data points and netting out 3-4 key issues in under 10 minutes, and then communicating these to a group of execs with very different analytical skills, is truly difficult. The key is to make it simple but not simplistic, which means you still capture the complexity even as you get to the few core insights. It requires a very thorough effort to gather all the relevant information before categorizing, prioritizing and deciding if it is significant. After a while, you become an expert and can sniff things out quickly. At the same time, there is the danger of missing anomalies when you jump to conclusions based only on a summary look.
  • Storytelling Skills: There are stages that should be followed when explaining complex ideas, something data translators are frequently expected to do. The best storytellers start by giving context and trying to couple the current discussion to something the audience already knows, ensuring the story is well structured and connected. We have to move from a “buffet style” business review with thousands of numbers packed in tables to a layered approach that will guide the audience to focus first on the most relevant messages, diving deeper only when necessary. Minto Pyramid Principles, which are built around a process for organizing thought and communication, are helpful in making sure you really focus on what is important and relevant, versus being obsessed with telling every fact.
  • Expertise in Data Visualization: I am glad to finally see so much focus on Information Visualization, and I believe this is correlated to the explosion of data. Traditional methods of organizing data do not facilitate an intuitive understanding of key information points or trends. For instance, the two examples below contain data on car sales across the U.S. The first, an alphabetized list, is much less intuitive than the second, which shows those sales on a map in Power View. With Power View, right away you can identify the states with the highest sales: CA, FL, TX, NY. (Workbook available here)
  • There is no better way to see patterns or trends than data visualization, making expertise in this area – both technical and analytical – critical for data translators.

16.4.2. Replication Compatibility Between MySQL Versions

  • MySQL supports replication from one major version to the next higher major version. For example, you can replicate from a master running MySQL 4.1 to a slave running MySQL 5.0, from a master running MySQL 5.0 to a slave running MySQL 5.1, and so on (a minimal version-compatibility check is sketched after this list).
  • However, one may encounter difficulties when replicating from an older master to a newer slave if the master uses statements or relies on behavior no longer supported in the version of MySQL used on the slave. For example, in MySQL 5.5, CREATE TABLE ... SELECT statements are permitted to change tables other than the one being created, but are no longer allowed to do so in MySQL 5.6 (see Section 16.4.1.4, “Replication of CREATE TABLE ... SELECT Statements”).
  • Important: It is strongly recommended to use the most recent release available within a given MySQL major version, because replication (and other) capabilities are continually being improved. It is also recommended to upgrade masters and slaves that use early releases of a major version of MySQL to GA (production) releases when the latter become available for that major version.
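The rule quoted above (a slave should run the same major version as the master, or the next higher one) can be written down as a small check. The Python helper below is hypothetical and not part of MySQL; the ordered list of major versions is assumed from the series the manual mentions.

    # Hypothetical helper encoding the documented rule: a slave may run the
    # same MySQL major version as the master or the next higher one.
    MAJOR_ORDER = ["4.1", "5.0", "5.1", "5.5", "5.6"]  # assumed release sequence

    def replication_supported(master: str, slave: str) -> bool:
        """Return True if replicating master -> slave follows the stated rule."""
        master_major = ".".join(master.split(".")[:2])
        slave_major = ".".join(slave.split(".")[:2])
        gap = MAJOR_ORDER.index(slave_major) - MAJOR_ORDER.index(master_major)
        return gap in (0, 1)

    if __name__ == "__main__":
        print(replication_supported("5.0.96", "5.1.73"))  # True: next major version
        print(replication_supported("5.1.73", "5.6.20"))  # False: skips 5.5
        print(replication_supported("5.6.20", "5.5.40"))  # False: slave is older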

Moving Sugar to Another Server - SugarCRM Support Site

    • japtone (forum reply): If you're using Linux, try to have the same versions of PHP, Apache, and the database (MySQL, for instance) on both servers to avoid compatibility issues. On your production server, tar up the sugarcrm root directory, transfer it to the new server, and untar it wherever your new root directory will be. Next, take a dump of your database, transfer it to the new server, and restore it. Make sure Apache on the new server is configured to point to the root of sugarcrm and start it up. Make sure to modify config.php to account for any change in paths and hostname. That's what I've found to be the easiest way to 'clone' Sugar.
  • Extract the Database: mysqldump -h localhost -u [MySQL user, e.g. root] -p[database password] -c --add-drop-table --add-locks --all --quick --lock-tables [name of the database] > sqldump.sql
  • Copy Filesystem: Copy all your files to the new server. This can be done simply by locating the root directory on your old instance and copying it to the new server location.
  • Import Database: Import the MySQL database into the new server. For example, to restore a custback.sql file into the Customers database: mysql -u sadmin -ppass21 Customers < custback.sql. The general format is: mysql -u [username] -p[password] [database_to_restore] < [backupfile]
  • Check Config.php: Open <sugarroot/config.php> and make sure that all settings still apply to the new server, such as: array ( 'db_host_name' => 'localhost', 'db_user_name' => 'root', 'db_password' => 'PASSWORD', 'db_name' => 'DATABASE_NAME', 'db_type' => 'mysql', ), 'site_url' =>, etc...
  • Check htaccess: Open <sugarroot/.htaccess> and ensure that the new server URLs are used correctly.
  • Check Permissions: Check that the permissions are correct on the new server; the entire custom and cache directories (and all their subdirectories), along with the config.php file, must be owned and writable by the user that runs the application (a quick writability check is sketched below).
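A quick automated check of those paths can save time after the move. The Python sketch below is a hypothetical post-migration check (the SUGAR_ROOT path is an assumption); run it as the same OS user that runs the application.

    # Hypothetical post-migration check: confirm the paths SugarCRM writes to
    # are writable by the user running this script (the application user).
    import os

    SUGAR_ROOT = "/var/www/sugarcrm"  # assumed install location

    PATHS_TO_CHECK = [
        os.path.join(SUGAR_ROOT, "custom"),
        os.path.join(SUGAR_ROOT, "cache"),
        os.path.join(SUGAR_ROOT, "config.php"),
    ]

    def check_writable(paths):
        problems = []
        for path in paths:
            if not os.path.exists(path):
                problems.append("missing: " + path)
            elif not os.access(path, os.W_OK):
                problems.append("not writable: " + path)
        return problems

    if __name__ == "__main__":
        issues = check_writable(PATHS_TO_CHECK)
        if issues:
            print("Fix before starting Sugar:")
            for issue in issues:
                print("  - " + issue)
        else:
            print("custom/, cache/ and config.php look writable.")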

FREE PDF Printer

  • Support for Windows Terminal Server

Magic Quadrant for Business Intelligence and Analytics Platforms

  • Integration
    BI infrastructure: All tools in the platform use the same security, metadata, administration, portal integration, object model and query engine, and should share the same look and feel.
    Metadata management: Tools should leverage the same metadata, and the tools should provide a robust way to search, capture, store, reuse and publish metadata objects, such as dimensions, hierarchies, measures, performance metrics and report layout objects.
    Development tools: The platform should provide a set of programmatic and visual tools, coupled with a software developer's kit for creating analytic applications, integrating them into a business process, and/or embedding them in another application.
    Collaboration: Enables users to share and discuss information and analytic content, and/or to manage hierarchies and metrics via discussion threads, chat and annotations.
  • Information Delivery
    Reporting: Provides the ability to create formatted and interactive reports, with or without parameters, with highly scalable distribution and scheduling capabilities.
    Dashboards: Includes the ability to publish Web-based or mobile reports with intuitive interactive displays that indicate the state of a performance metric compared with a goal or target value. Increasingly, dashboards are used to disseminate real-time data from operational applications, or in conjunction with a complex-event processing engine.
    Ad hoc query: Enables users to ask their own questions of the data, without relying on IT to create a report. In particular, the tools must have a robust semantic layer to enable users to navigate available data sources.
    Microsoft Office integration: Sometimes, Microsoft Office (particularly Excel) acts as the reporting or analytics client. In these cases, it is vital that the tool provides integration with Microsoft Office, including support for document and presentation formats, formulas, data "refreshes" and pivot tables. Advanced integration includes cell locking and write-back.
    Search-based BI: Applies a search index to structured and unstructured data sources and maps them into a classification structure of dimensions and measures that users can easily navigate and explore using a search interface.
    Mobile BI: Enables organizations to deliver analytic content to mobile devices in a publishing and/or interactive mode, and takes advantage of the mobile client's location awareness.
  • Analysis
    Online analytical processing (OLAP): Enables users to analyze data with fast query and calculation performance, enabling a style of analysis known as "slicing and dicing." Users are able to navigate multidimensional drill paths. They also have the ability to write back values to a proprietary database for planning and "what if" modeling purposes. This capability could span a variety of data architectures (such as relational or multidimensional) and storage architectures (such as disk-based or in-memory).
    Interactive visualization: Gives users the ability to display numerous aspects of the data more efficiently by using interactive pictures and charts, instead of rows and columns.
    Predictive modeling and data mining: Enables organizations to classify categorical variables, and to estimate continuous variables using mathematical algorithms.
    Scorecards: These take the metrics displayed in a dashboard a step further by applying them to a strategy map that aligns key performance indicators (KPIs) with a strategic objective.
    Prescriptive modeling, simulation and optimization: Supports decision making by enabling organizations to select the correct value of a variable based on a set of constraints for deterministic processes, and by modeling outcomes for stochastic processes.
  • These capabilities enable organizations to build precise systems of classification and measurement to support decision making and improve performance. BI and analytic platforms enable companies to measure and improve the metrics that matter most to their businesses, such as sales, profits, costs, quality defects, safety incidents, customer satisfaction, on-time delivery and so on. BI and analytic platforms also enable organizations to classify the dimensions of their businesses — such as their customers, products and employees — with more granular precision. With these capabilities, marketers can better understand which customers are most likely to churn. HR managers can better understand which attributes to look for when recruiting top performers. Supply chain managers can better understand which inventory allocation levels will keep costs low without increasing out-of-stock incidents.
  • descriptive, diagnostic, predictive and prescriptive analytics
  • data discovery vendors — such as QlikTech, Salient Management Company, Tableau Software and Tibco Spotfire — received more positive feedback than vendors offering OLAP cube and semantic-layer-based architectures.
  • Microsoft Excel users are often disaffected business BI users who are unable to conduct the analysis they want using enterprise, IT-centric tools. Since these users are the typical target users of data discovery tool vendors, Microsoft's aggressive plans to enhance Excel will likely pose an additional competitive threat beyond the mainstreaming and integration of data discovery features as part of the other leading, IT-centric enterprise platforms.
  • Building on the in-memory capabilities of PowerPivot in SQL Server 2012, Microsoft introduced a fully in-memory version of Microsoft Analysis Services cubes, based on the same data structure as PowerPivot, to address the needs of organizations that are turning to newer in-memory OLAP architectures over traditional, multidimensional OLAP architectures to support dynamic and interactive analysis of large datasets. Above-average performance ratings suggest that customers are happy with the in-memory improvements in SQL Server 2012 compared with SQL Server 2008 R2, which ranks below the survey average.
  • "Gartner defines the business intelligence (BI) and analytics platform market as a software platform that delivers 15 capabilities across three categories: integration, information delivery and analysis."

The Past, Present and Future of Business Intelligence. - YouTube

  • Irshad Raihan interviews Don Lutter, senior BI solutions manager, on HP's products, solutions and services for Business Intelligence. Don has over 30 years of experience building and implementing BI solutions. He talks about HP's view on where the market is headed and how HP can help customers address the challenges of Big Data and Real Time Analytics. This interview was recorded at HP Discover 2011 in Las Vegas.