
BI-TAGS: Group items tagged "examples"

lvm - Resizing logical volume on Oracle Enterprise Linux 5.6 - Server Fault

  •  
    " 1. use fdisk create a new primary partition using the available open space. For this example it would be /dev/sda3. 2. pvcreate /dev/sda3 to set it up 3. vgextend /dev/VolGroup00 /dev/sda3 4. do a vgdisplay and see how many open extents you have on VolGroup00. For this example assume 407 extents are open 5. lvextend -l +407 /dev/VolGroup00/LogVol00 6. resize2fs /dev/VolGroup00/LogVol00 (assumming ext2/ext3)"

Saving Current Values with Cascading LOVs

  •  
    "Saving Current Values with Cascading LOVs A friend, Monty Latiolais, recently asked an interesting question regarding cascading LOVs: Say you've got two LOVs...STATES and CITIES. They both default to 'ALL' and 'ALL'. Since CITIES is dependent on STATES, as soon as STATES is changed, CITIES is blanked out. What should happen is that CITIES gets re-evaluated as in the following example... let's say STATES is ALL and CITIES is "Houston". If one then changes STATES to "Texas", CITIES should remain "Houston" as that is a valid value for CITIES. So basically, is it possible to maintain the selected value of an item if that same value exists in the list of values after refreshing? That's a great question! Thanks to new events in the APEX framework and Dynamic Actions the solution is far easier than it would have been in the past! Click here to see the demo but continue reading to learn how it all works… There are a three main events you need to be concerned with when it comes to cascading selects: change apexbeforerefresh apexafterrefresh The change event is a standard part of JavaScript and the DOM. This event fires when the user manually changes the value of the select list but can also be triggered programmatically via JavaScript. The apexbeforerefresh and apexafterrefresh events are custom events in the APEX framework. They fire just before and just after AJAX requests refresh something on the page. The events work with many items and regions that utilize this technology. In this example we have two select lists: parent and child. If you change the value of the child select list then the change event will fire and that's it. But if you change the value of the parent select list a lot more happens to the child select. Here are some of the highlights: The current LOV values are cleared out The apexbeforerefresh event is triggered An AJAX request brings back new values. This only happens if optimize refresh is set to false optimize refresh is set to true and

Rittman Mead Consulting - The Changing World of Business Intelligence

  • Schema on write This is the traditional approach for Business Intelligence. A model, often dimensional, is built as part of the design process. This model is an abstraction of the complexity of the underlying systems, put in business terms. The purpose of the model is to allow the business users to interrogate the data in a way they understand.
  • The model is instantiated through physical database tables and the data is loaded through an ETL (extract, transform and load) process that takes data from one or more source systems and transforms it to fit the model, then loads it into the model.
  • The key thing is that the model is determined before the data is finally written and the users are very much guided or driven by the model in how they query the data and what results they can get from the system. The designer must anticipate the queries and requests in advance of the user asking the questions.
  • Schema on read Schema on read works on a different principle and is more common in the Big Data world. The data is not transformed in any way when it is stored; the data store acts as a big bucket. The modelling of the data only occurs when the data is read. Map/Reduce is the clearest example: the mapping is the understanding of the data structure. Hadoop is a large distributed file system which is very good at storing large volumes of data, but this is only potential. It is the mapping of this data that provides value, and this is done when the data is read, not written.
  • New World Order So whereas Business Intelligence used to always be driven by the model, the ETL process to populate the model and the reporting tool to query the model, there is now an approach where the data is collected in its raw form, and advanced statistical or analytical tools are used to interrogate the data. An example of one such tool is R.
  • Which approach to use is often driven by what the user wants to find out. If the question is clearly formed and the sources of data that are required to answer it are well understood, for example how many units of a product have we sold, then the traditional schema on write approach is best.

EnablingUseOfApacheHtaccessFiles - Community Ubuntu Documentation

  • Example Here is an example of how to prevent users from accessing the directory, password-protect a specific file, and allow users to view a specific file:

    AuthUserFile /your/path/.htpasswd
    AuthName "Authorization Required"
    AuthType Basic
    Order Allow,Deny
    <Files myfile1.html>
        Order Allow,Deny
        require valid-user
    </Files>
    <Files myfile2.html>
        Order Deny,Allow
    </Files>
  •  
    "Password-Protect a Directory With .htaccess"

Top Mistakes to Avoid in Analytics Implementations | StatSlice Business Intelligence an...

  • Mistake 1. Not putting a strong interdisciplinary team together. It is impossible to put together an analytics platform without understanding the needs of the customers who will use it. Sounds simple, right? Who wouldn't do that? You'd be surprised how many analytics projects are wrapped up by IT because "they think" they know the customer needs. Not assembling the right team is clearly the biggest mistake companies make. Many times what is on your mind (and if you're an IT person willing to admit it) is that you are considering converting all those favorite company reports. Your goal should not be that. Your goal is to create a system, human-engineered with customers, financial people, IT folks, analysts, and others, that gives people new and exciting ways to look at information. It should give you new insights. New competitive information. If you don't get the right team put together, you'll find someone longing for the good old days and their old dusty reports. Or worse yet, still finding ways to generate those old dusty reports.
    Mistake 2. Not having the right talent to design, build, run and update your analytics system. It is undeniable that there is now high demand for business analytics specialists. There are not a lot of them out there that really know what to do unless they've been burned a few times and have survived and then built successful BA systems. This is reflected by the fact you see so many analytics vendors offer, or often recommend, third-party consulting and training to help the organization develop their business analytic skills. Work hard to build a three-way partnership between the vendor, your own team, and an implementation partner. If you develop those relationships, risk of failure goes way down.
  • Mistake 3. Putting the wrong kind of analyst or designer on the project. This is somewhat related to Mistake 2 but with some subtle differences. People have different skillsets, so you need to make sure the person you're considering putting on the project is the right "kind." For example, when you put the design together you need both drill-down and summary models. Both have different types of users. Does this person know how to do both? Or, for example, inexperience might lead an analyst to believe vendor claims without being able to verify them as to functionality or time to implement.
    Mistake 4. Not understanding how clean the data you are getting is, and the time frame to get it clean. Profile your data to understand the quality of your source data. This will allow you to adjust your system accordingly to compensate for some of those issues or, more importantly, push data fixes to your source systems. Ensure high-quality data or you risk upsetting your customers. If you don't have a good understanding of the quality of your data, you could easily find yourself way behind schedule even though the actual analytics and business intelligence framework you are building is coming along fine.
    Mistake 5. Picking the wrong tools. How often do organizations buy software tools that just sit on the shelf? This often comes from management rushing into a quick decision based on a few demos they have seen. Picking the right analytics tools requires an in-depth understanding of your requirements as well as the strengths and weaknesses of the tools you are evaluating. The best way to achieve this understanding is by getting an unbiased implementation partner to build a proof of concept with a subset of your own data and prove out the functionality of the tools you are considering.
    Bottom Line. Think things through carefully. Make sure you put the right team together. Have a data cleansing plan. If the hype sounds too good to be true, have someone prove it to you.

Google Reader (250)

  • What this means in practice is that when the BI Server component starts up, it creates and reserves a number of threads in advance, determined by a number of parameters including SERVER_THREAD_RANGE.
  • You can see these threads running and ready to perform tasks for the BI Server component by using a tool such as Process Explorer for Windows.
  • Thinking it through a bit, any given single query is, to a certain extent, only really going to use a small part of the total amount of CPUs available on a server, because it’s not the BI Server that runs queries in parallel, it’s the underlying database. For example, a single analysis against a single Oracle Database datasource would only really need a single BI Server thread to handle the query request, but when the underlying database receives the query, it might use a large number of its CPUs to process the query, returning results back to the BI Server to then pass back to the Presentation Server for display to the user.
  • The BI Server wouldn’t have any use for any more query threads, as it can’t really do anything with them – the exception to this being queries that generate multiple physical SQLs, for example to join data from multiple sources together and return a single set of data to the user, for which the BI Server could benefit from a higher CPU count if each of these queries in turn led to lots of threads being used – but two queries, in themselves, don’t necessarily require two CPUs, because of course the BI Server, and the underlying CPUs, are themselves multi-threaded.
  • To conclude then – all things being equal, the BI Server should make use of all of the CPUs that the underlying operating system presents to it, with the OS itself deciding what threads are scheduled against which CPUs. In theory, all CPUs on the server are available to each BI Server component, but each OS is different and it might be worth experimenting if you’re sure that certain CPUs aren’t being used – but this is most probably unlikely and the main reason you’d really consider vertical scale-out of BI Server components is for fault-tolerance, or if you’re using a 32-bit OS and each process can only see a subset of the total overall memory. And, bear in mind that however many CPUs the BI Server has available to it, for queries that send just a single SQL statement down to the underlying database server, adding more CPUs or faster CPUs isn’t going to help as only a single (or so) thread will be needed to send the query from the BI Server to the database, and it’s the database that’s doing all of the work – all that this would help with is compilation and post-aggregation work, and enabling the server to handle a higher number of concurrent users. Invest in a better underlying database instead, sort out your data model, and make sure your data source back-end is as optimised as possible.
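
  The SERVER_THREAD_RANGE parameter mentioned in the first annotation lives in the BI Server's NQSConfig.INI; a quick way to inspect it, assuming a default OBIEE 11g directory layout (the path and component name vary by install):

    # Shows the configured min-max size of the BI Server's thread pool, e.g. SERVER_THREAD_RANGE = 40-100;
    grep SERVER_THREAD_RANGE $ORACLE_INSTANCE/config/OracleBIServerComponent/coreapplication_obis1/NQSConfig.INI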

16.4.2. Replication Compatibility Between MySQL Versions

  • MySQL supports replication from one major version to the next higher major version. For example, you can replicate from a master running MySQL 4.1 to a slave running MySQL 5.0, from a master running MySQL 5.0 to a slave running MySQL 5.1, and so on.
  • However, one may encounter difficulties when replicating from an older master to a newer slave if the master uses statements or relies on behavior no longer supported in the version of MySQL used on the slave. For example, in MySQL 5.5, CREATE TABLE ... SELECT statements are permitted to change tables other than the one being created, but are no longer allowed to do so in MySQL 5.6 (see Section 16.4.1.4, “Replication of CREATE TABLE ... SELECT Statements”).
  • Important: It is strongly recommended to use the most recent release available within a given MySQL major version because replication (and other) capabilities are continually being improved. It is also recommended to upgrade masters and slaves that use early releases of a major version of MySQL to GA (production) releases when the latter become available for that major version.
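
  A quick sanity check of that rule before wiring up replication is to compare the two server versions from the shell (the hostnames and the repl account are placeholders):

    # The slave should run the same major version as the master, or the next higher one.
    mysql -h master.example.com -u repl -p -e "SELECT VERSION();"
    mysql -h slave.example.com  -u repl -p -e "SELECT VERSION();"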

Convert VirtualBox (vdi) hard drive image to VMWare (vmdk) format » MikeBeach...

  • Example (Windows):

    1"c:Program FilesOracleVirtualBoxVBoxManage.exe" clonehd "Win XP.vdi" xp.vmdk  --format vmdk --variant standard
  • Example (Linux):

    VBoxManage clonehd "Win XP.vdi" xp.vmdk --format vmdk --variant standard
  • Next, open VMWare and select Create a new virtual machine. Select “I will install the operating system later”. Make your OS selection based on the OS that’s currently on the vmdk you will be using (the guest OS, not the host OS). Later on, you will have the option to use an existing vmdk image as your virtual hard drive; do so. You should now be able to finish setup and boot your converted disk image.
  • Absolute path to VBoxManage is necessary unless it’s in the Windows $PATH.
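
  Before booting in VMWare, the result of the conversion can be checked with another VBoxManage subcommand; a quick sketch using the file name from the example above:

    VBoxManage showhdinfo xp.vmdk    # prints the format, logical size, and UUID of the converted image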

[Tutorial] VLOOKUP questions and answers (View topic) - OpenOffice.org Community Forum

  • Summary: Check Search whole cells and uncheck Regular expressions
  •  
    "Very important: Two of the options (OpenOffice.org > Preferences on a Mac, Tools > Options on other platforms) affect several functions, including VLOOKUP. Both of these options are in the Calc > Calculate section: Search criteria = and <> must apply to whole cells - If you uncheck this option text searches in VLOOKUP can match a substring of the values in the table so in the example a search for B will find B+. You almost certainly want to enable this option so that an exact match must occur. Enabling the option also makes your VLOOKUP formulas compatible with Excel. Enable regular expressions in formulas - Unless you understand what "regular expressions" are (see Help) and unless you specifically want to use them in your spreadsheet, you will want to uncheck Enable regular expressions in formulas because this option can make VLOOKUP difficult to use. Unchecking the option also makes your VLOOKUP formulas compatible with Excel. The questions below address what happens if you enable this option. Summary: Check Search whole cells and uncheck Regular expressions"

Using the JDBC Connectivity Layer in Oracle Warehouse Builder

  • For example, suppose you want to add support for MySQL. (As of OWB 11g R2, MySQL is not on the list of platforms supported by default.) All you need to do, though, is download the MySQL JDBC driver, put it into the OWB_HOME/owb/lib/ext directory, and add the platform definition for MySQL via a Tcl script that you can run from the OMB Plus console. The contents of such a script are beyond the scope of this article. However, if you want to look at one, check out this post by David Allan, where you’ll find a detailed example of how you can add support for MySQL to Oracle Warehouse Builder 11g Release 2. Also, there is a whitepaper on OTN called the "OWB Platform and Application Adapter Extensibility Cookbook", which goes into more depth than David’s post.
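
  The driver step from the annotation as a one-line sketch (the Connector/J jar name shown is only an example; versions vary, and OWB_HOME is assumed to point at your OWB installation):

    # Drop the MySQL JDBC driver where OWB picks up extension libraries.
    cp mysql-connector-java-5.1.22-bin.jar $OWB_HOME/owb/lib/ext/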

Big data: The next frontier for innovation, competition, and productivity | McKinsey & ...

  • The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus, according to research by MGI and McKinsey's Business Technology Office.
  • For example, a retailer using big data to the full could increase its operating margin by more than 60 percent.
  • important factor of production, alongside labor and capital.
  • five broad ways in which using big data can create value
  • Leading companies are using data collection and analysis to conduct controlled experiments to make better management decisions
  • others are using data for everything from basic low-frequency forecasting to high-frequency nowcasting to adjust their business levers just in time.
  • big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.
  • Fourth, sophisticated analytics can substantially improve decision-making
  • big data can be used to improve the development of the next generation of products and services.
  • The use of big data will become a key basis of competition and growth for individual firms.
  • For example, we estimate that a retailer using big data to the full has the potential to increase its operating margin by more than 60 percent.
  • The computer and electronic products and information sectors, as well as finance and insurance, and government are poised to gain substantially from the use of big data.

Analyzing Human Data: Take a Dive to Find Out What Your Customers Really Feel - Content...

  • What really interests me, and what I think should interest marketers, is what I’ll call signals – one of which is intent. Intent is critical because it can predict action. For example, “Is this person shopping to buy a product like my product?” “Is this person unhappy and needing some form of attention?” “Is this person about to return the product for a reason that is addressable?”
  • Sentiment is one ingredient of intent. If someone is happy, sad, angry … that can be determined via sentiment analysis technologies.
  • Many tools struggle with context.
  • An example I hear over and over again is “thin” – good when you’re talking about electronics, but bad if you’re talking about hotel walls or the feel of hotel sheets. To do sentiment analysis correctly, you need refinement. You need customization for particular industries and business functions.
  • The market, unfortunately, is polluted with tools that claim to have sentiment abilities, but are too crude to be usable. Even with refinement (e.g., the ability to handle negators and contextual sentiment), approaches that deliver only positive and negative ratings don’t take you very far.
  • There are definitely easy, inexpensive entry points that can meet basic, just-getting-started needs: tools for social listening, survey analysis, customer service (handling contact-center notes, for instance), customer experience (via analysis of online reviews and forums), automated email processing, and other needs. These technologies are user friendly, available on demand, as a service.
  • Text mining: Digital Reasoning, Luminoso, and AlchemyAPI.
  • Image recognition and analysis: Image analysis now automatically identifies brand labels in pictures. Providers include VisualGraph (now owned by Pinterest), Curalate, Piqora (nee Pinfluencer), and gazeMetrix.
  • Emotional analysis in images, audio, and video: These companies promote analysis of speech and facial expression primarily for structured studies:
  • Affectiva conducts webcam emotional analysis for media and ad research, including development tools to integrate emotional study in mobile apps.
  • Emotient performs emotional analyses in retail environments, evaluating signage, displays, and customer service.
  • EmoVu by Eyeris tests the engagement level of both short- and long-form video content.
  • Beyond Verbal studies emotion based on a person’s voice in real time.

Filling a Critical Role in Business Today: The Data Translator - Microsoft Business Int...

  • a lot of articles calling data scientists and statisticians the jobs of the future
  • there are more immediate needs that, when addressed, will have a much greater business impact.
  • Right now we have huge opportunities to make the data more accessible, more “joinable” and more consumable. Leaders don’t want more data – they want more information they can use to run their businesses.
  • Every company has hundreds of millions of records about their sales, expenses, employees and so on, with dozens of insights yet to be discovered through simple comparison or triangulation of relevant data.
  • Why don’t we focus on this? I think because it’s very difficult to do – being successful in this “data translator” role requires a unique set of skills and knowledge, the combination of which I call the BASE skillset:
    Business understanding
    Ability to synthesize and simplify
    Storytelling skills
    Expertise in data visualization
  • Business Understanding This one seems obvious, but it doesn’t mean simply understanding the financials of a business. Rather, it means truly knowing the operational details, the incentives, the install base, market growth, penetration, the competition, etc. An analyst can’t just know the technical aspect of a report or the math behind the numbers, but what is truly driving a pattern in terms of product quality, competition, incentives and/or offerings. The best analysts are able to mathematically isolate the key levers of a trend and then suggest actions to react to or take advantage of those trends.
    Ability to Synthesize and Simplify This is, in my opinion, the most underrated and underappreciated skill. Combing through thousands of data points and netting out 3-4 key issues in under 10 minutes, and then communicating these to a group of execs with very different analytical skills, is truly difficult. The key is to make it simple but not simplistic, which means you still capture the complexity even as you get to the few core insights. It requires a very thorough effort to gather all the relevant information before categorizing, prioritizing and deciding if it is significant. After a while, you become an expert and can sniff things out quickly. At the same time, there is the danger of missing anomalies when you jump to conclusions based only on a summary look.
  • Storytelling Skills There are stages that should be followed when explaining complex ideas, something data translators are frequently expected to do. The best storytellers start by giving context and trying to couple the current discussion to something the audience already knows, ensuring the story is well structured and connected. We have to move from a “buffet style” business review with thousands of numbers packed in tables to a layered approach that will guide the audience to focus first on the most relevant messages, diving deeper only when necessary. Minto Pyramid Principles, which are built around a process for organizing thought and communication, are helpful in making sure you really focus on what is important and relevant, versus being obsessed with telling every fact.
    Expertise in Data Visualization I am glad to finally see so much focus on Information Visualization and I believe this is correlated to the explosion of data. Traditional methods of organizing data do not facilitate an intuitive understanding of key information points or trends. For instance, the two examples below contain data on car sales across the U.S. The first, an alphabetized list, is much less intuitive than the second, which shows those sales on a map in Power View. With Power View, right away you can identify the states with the highest sales: CA, FL, TX, NY. (Workbook available here)
  • There is no better way to see patterns or trends than data visualization, making expertise in this area – both technical and analytical – critical for data translators.

Static IP Address Assignment

  • Static IP Address Assignment To configure your system to use a static IP address assignment, add the static method to the inet address family statement for the appropriate interface in the file /etc/network/interfaces.
  • The example below assumes you are configuring your first Ethernet interface, identified as eth0. Change the address, netmask, and gateway values to meet the requirements of your network.

    auto eth0
    iface eth0 inet static
        address 10.0.0.100
        netmask 255.255.255.0
        gateway 10.0.0.1
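
  For the new settings to take effect, the interface has to be cycled (a sketch, assuming the classic ifupdown tools that read /etc/network/interfaces):

    # Bring eth0 down and back up so the static configuration is applied.
    sudo ifdown eth0 && sudo ifup eth0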

Dancing and Wrestling with Oracle APEX: Apex and FusionCharts (or There be dragons at t...

  • All of which led me to FusionCharts, which is a brilliant set of flash charts and widgets.
  • All I had to do was figure out how to integrate it into my app. First I had to write a function to extract the data I needed from my database and output it as correctly-formatted XML. That bit was easy so I won't bore you with it.
  • Next I uploaded the Flash (SWF) file for my chart into my workspace. (Tell me something: when you upload an image to your application using Apex's image uploader you refer to it by pointing at #APP_IMAGES#, so how do you think you'd refer to a file you've uploaded using Apex's file uploader? #APP_FILES#? Wrong! Illogically, all files uploaded into your application should be pointed at using the #APP_IMAGES# substitution string.) Finally, I created a dynamic PL/SQL content region outputting the necessary wrapper tags for my Flash movie (which I copied from the FusionCharts examples), pointing it to my uploaded swf file and feeding it the XML from my database function (which I call in a "before regions" page process).

8 Principles That Can Make You an Analytics Rock Star -- TDWI - The Data Warehousing Ins...

  • Great design, high-quality code, strong business sponsorship, accurate requirements, good project management, and thorough testing are some of the obvious requirements for successful analytics systems.
  • As a professional in the field, you must be able to do these things well because they form the foundation of a good analytics implementation.
  • Successful analytics professionals should follow a set of guiding principles.
  • Principle #1: Let your passion bloom
  • If you do not love data analytics, it will be hard to become an analytics rock star. No significant accomplishments are achieved without passion. For many people, passion does not come naturally; it must be developed. Cultivate passion by setting goals and achieving them. Realize that the best opportunity in your life is the one in front of you right now. Focus on it, grow it, and develop your passion for it! That excitement will become obvious to those around you.
  • Principle #2: Never stop learning
  • Dig deeper into the business details of your company. What, exactly, does your company do? What are some of its challenges and opportunities? How would the company benefit from valuable and transformative information you can deliver? Take the time necessary to learn the skills that are valuable for your business and your career. Keep up-to-date with the latest technologies and available analytics tools -- learn and understand their capabilities, functions, and differences.
  • Deepen your knowledge of the tools that you are currently working on by picking up new techniques and methodologies that make you a better professional in the field.
  • Principle #3: Improve your presentation skills and become an ambassador for analytics
  • Improve your presentation and speaking skills, even if it is on your own time. Excellent and no-cost presentation training resources are readily available on the internet (for example, at http://www.mindtools.com/page8.html). Practice writing and giving presentations to friends and colleagues who will give you honest feedback. Once you have practiced the basic skills, you need to enhance your skills by improving your persuasiveness and effectiveness.
  • You must be able to explain, justify, and "sell" your ideas to colleagues as well as business management. Organizational change does not happen overnight or as a result of one presentation. You need to be persistent and skillful in taking your ideas all the way up the leadership chain.
  • Principle #4: Be the "go-to guy" for tough analytics questions
  • Tough analytics problems typically don't have an obvious answer -- that's why they're tough! Take the initiative by digging deep into those problems without being asked. Throw out all the assumptions made so far and follow a logical trial-and-error methodology. First, develop a thesis about possible contributors to the problem at hand. Second, run the analytics to prove the thesis. Learn from that outcome and start over, if needed, until a significant answer is found. You are now well on your way to rock star status.

Connecting Infobright and Talend

  • These instructions assume that you have Infobright installed and running. First and foremost, download Talend. In this example, we will download Talend Open Source Data Integrator v5.0 (http://www.talend.com/download.php). Once fully installed, download the Talend/Infobright Connector. Ensure you download the right connector; instructions are on the download page (http://www.infobright.org/Downloads/Contributed-Software/): if you download Talend 4.0+, you’ll want the latest connector; for older versions of Talend, you’ll want the 3.7 connector and lower. Once downloaded, perform the following actions [for Windows]:
    1. Copy the infobright_jni([_32|_64])bit.dll to C:\Windows\infobright_jni.dll
    2. Copy the zipped “tInfobrightOutput” directory to this directory: [Install Root of Talend]\plugins\org.talend.designer.components.localprovider_5.0.1.r74687\components\tInfobrightOutput
    3. Copy “infobright-core-3.4.jar” to [Install Root of Talend]\lib\java
    Running Talend in Windows: if using Windows, run Talend as Administrator. If you don’t, you will see odd “Access Denied” or “Accès Refusé” error messages when trying to use the connector.
  • You need to do some work on these instructions. Version 5 is not like version 4. You must run Talend 5 before the “lib\java\” folder appears. Once it does appear, it no longer contains the .jar files like version 4; there is just a file “index.xml” that you have to edit to point to the infobright jar file in the components folder.
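
  Putting the two annotations together, a command-prompt sketch for a Talend 5 install on 64-bit Windows (the install root C:\Talend\5.0.1 is a placeholder; on Talend 4 you would instead copy infobright-core-3.4.jar into lib\java):

    REM Run from the directory where the connector download was unpacked.
    copy infobright_jni_64bit.dll C:\Windows\infobright_jni.dll
    xcopy /E tInfobrightOutput "C:\Talend\5.0.1\plugins\org.talend.designer.components.localprovider_5.0.1.r74687\components\tInfobrightOutput\"
    REM On Talend 5, start Talend once so lib\java appears, then edit lib\java\index.xml
    REM to point at the infobright jar in the components folder, as noted above.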