
BI-TAGS: Group items tagged "experience"

cezarovidiu

10 Reasons Why CEOs Don't Understand Their Customers - Forbes - 0 views

  • 1) Do bad customer experiences cause people to switch brands? In a 2011 research project conducted by CX application vendor RightNow, 89% of consumers said that yes, a bad experience has spurred them to switch brands. But in the brand-new study of business-executive perceptions that’s the subject of this column, only 49% of the surveyed executives said yes. QUESTION: What steps do you need to take to close this dangerous perception gap?
  • 2) While 97% of executives say CX is critical to the success of their company, and 91% say they’re committed to making their company a CX leader, only 20% would rate their own CX initiatives as “advanced,” with a dedicated CX leader in place, initial projects pushed to the optimization phase, and the overall project extended to new channels and groups. QUESTION: What are the obstacles preventing you from aligning your actions with your words? If you say it’s a “budget” issue, aren’t you really talking about strategic priorities rather than line items?
  • 3) Most companies have a clear and direct understanding of the looming CX challenge and the powerful interaction of social media. The study found that the top two drivers for CX initiatives are (a) rising expectations from customers (59%), and (b) the impact of social media on customers’ ability to broadcast good and bad experiences (37%). Now, even if you’re able to somehow rationalize those findings, here’s one that not even the most-accommodating executive can dismiss:
  • 4) Being a CX laggard can cost those companies many tens of millions or even hundreds of millions of dollars in lost revenue: executives estimated that the lack of positive, consistent, and brand-relevant customer experience can cause them to lose out on a staggering 20% in annual revenue.
  • Worse yet, all that money’s likely to wind up in the pockets of your competitors!
  • 5) While 81% of execs said they believe that social media is an essential ingredient in delivering great customer experiences, 35% of responding companies still do not have social media for sales channels, and another 35% still do not have social media for customer service. QUESTION: How do you plan to close that dangerous gap?
cezarovidiu

Business Intelligence Blog - The ElastiCube Chronicles - 0 views

  • SiSense’s survey finds that salaries for data professionals are on the rise across all geographies. The annual earnings of a data professional can range from an average of $55,000 USD for a data analyst to an average of $132,000 for VP Analytics. As many as 61% of the survey respondents reported higher earnings in 2012 compared to 2011, and only 12% reported lower earnings.
  • Other highlights of the survey findings include: Data professionals are highly educated. 85% of the respondents have some college degree, 39% have a Master’s degree, and 5% are Ph.D.s. Those with doctoral degrees earn on average 65% more than those with Master’s degrees, who in turn earn 16% more than those with Bachelor’s degrees. On-the-job experience is even more important than education in determining salary levels. On average, professionals with ten or more years of experience earn 80% more than those with three years or less. (A quick relative-salary index calculation follows this list.)
  • At the same time, the survey shows that those with 6 years or less make up as much as 59% of the data profession workforce.
  • Most Data Professionals work in teams of up to five people. “Companies are starting to realize that Data is key to their success. The majority of them, though, are not growing their Data Science teams fast enough to win. This may be because they don’t want to or because they can’t. This is an alarming trend, though, and only software can come to the rescue,” noted Aziza.
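The education premiums above compound. Here is a quick sketch in Python, with the Bachelor's-level salary set to an arbitrary index of 100 rather than any figure from the survey:

```python
# Hypothetical index: Bachelor's-level salary pegged at 100 (not a survey figure).
bachelors = 100.0
masters = bachelors * 1.16   # Master's holders earn ~16% more than Bachelor's holders
phd = masters * 1.65         # Ph.D. holders earn ~65% more than Master's holders

print(f"Bachelor's index: {bachelors:.1f}")   # 100.0
print(f"Master's index:   {masters:.1f}")     # 116.0
print(f"Ph.D. index:      {phd:.1f}")         # 191.4
```

In other words, the 16% and 65% premiums compound to roughly a 1.9x gap between Ph.D. and Bachelor's earners.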
cezarovidiu

Obiee - Professional Experience,Email,Phone numbers..Everything! - 0 views

  • yatedo
cezarovidiu

Big Data is a Solution Looking for a Problem: Gartner - CIO India News on | CIO.in - 0 views

  • Big Data is forecast to drive $34 billion of IT spending in 2013 and create 4.4 million IT jobs by 2015, but it is currently still a solution looking for a problem, according to analyst firm Gartner.
  • While businesses are keen to start mining their data stores for useful insights, and many are already experimenting with technologies like Hadoop, the biggest challenge is working out what question you are trying to answer.
cezarovidiu

You Probably Need Parallel Except When You Don't - 0 views

  • If you are running a large Oracle data warehouse, you should be using parallel.
  • Like all tools, you have to use parallel correctly; no more would we think of using a wrench to hammer a nail than should you think parallel is the answer to all performance problems. Sometimes parallel will make things worse, sometimes parallel will make performance less predictable.
  • Parallel introduces additional work to a query: simplistically, we need to split the query into multiple parallel processes, execute them, wait for the processes to complete, and finally coordinate the results. This all takes time to do. Our time saving comes from being able to process multiple smaller chunks of data simultaneously. If the time to execute the step in parallel is not significantly faster than doing it without parallel, then the additional overhead may make parallel processing a slower option; this is typically the case with small tables, where a full tablescan or an indexed access is fast. Use too few parallel processes and we will not gain much in performance; too many and we risk starving the database of resources for other work, or even slowing our own process as it waits for resources. If you have implemented some form of CPU resource management on your system, you may find that you experience delays as your parallel slaves ‘wait their turn’.
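To make the trade-off concrete, here is a minimal sketch of requesting parallel execution for a single statement via an optimizer hint, using the python-oracledb driver. The connection details and the sales table are placeholders, and the degree of 8 is only an illustration of the "not too few, not too many" point above; none of this comes from the quoted post.

```python
# Minimal sketch: ask for parallel execution on one statement via a hint.
# Connection details and the SALES table are placeholders.
import oracledb

conn = oracledb.connect(user="dw_user", password="change_me", dsn="dwhost/dwpdb")
cur = conn.cursor()

# Without the hint the scan runs serially (or at the table's default degree).
# /*+ PARALLEL(s 8) */ asks the optimizer for a degree of parallelism of 8 on
# this scan only; too low a degree gains little, too high can starve other
# work of CPU and I/O.
cur.execute("""
    SELECT /*+ PARALLEL(s 8) */ s.region, SUM(s.amount) AS total_amount
    FROM   sales s
    GROUP  BY s.region
""")

for region, total_amount in cur:
    print(region, total_amount)

cur.close()
conn.close()
```

For a small dimension table an indexed access will usually beat this; the hint only pays off when the scan is large enough to amortize the coordination overhead.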
cezarovidiu

Google Reader (250) - 0 views

  • What this means in practice is that when the BI Server component starts up, it creates and reserves a number of threads in advance, determined by a number of parameters including SERVER_THREAD_RANGE.
  • You can see these threads running and ready to perform tasks for the BI Server component by using a tool such as Process Explorer for Windows. (A generic thread-pool sketch after this list illustrates the pre-created worker model.)
  • Thinking it through a bit, any given single query is, to a certain extent, only really going to use a small part of the total amount of CPUs available on a server, because it’s not the BI Server that runs queries in parallel, it’s the underlying database. For example, a single analysis against a single Oracle Database datasource would only really need a single BI Server thread to handle the query request, but when the underlying database receives the query, it might use a large number of its CPUs to process the query, returning results back to the BI Server to then pass back to the Presentation Server for display to the user.
  • The BI Server wouldn’t have any use for any more query threads, as it can’t really do anything with them – the exception to this being queries that generate multiple physical SQLs, for example to join data from multiple sources together and return a single set of data to the user, for which the BI Server could benefit from a higher CPU count if each of these queries in turn led to lots of threads being used – but two queries, in themselves, don’t necessarily require two CPUs, because of course the BI Server, and the underlying CPUs, are themselves multi-threaded.
  • To conclude then – all things being equal, the BI Server should make use of all of the CPUs that the underlying operating system presents to it, with the OS itself deciding which threads are scheduled against which CPUs. In theory, all CPUs on the server are available to each BI Server component, but each OS is different and it might be worth experimenting if you’re sure that certain CPUs aren’t being used – but this is most probably unlikely, and the main reason you’d really consider vertical scale-out of BI Server components is for fault-tolerance, or if you’re using a 32-bit OS and each process can only see a subset of the total overall memory. And bear in mind that however many CPUs the BI Server has available to it, for queries that send just a single SQL statement down to the underlying database server, adding more CPUs or faster CPUs isn’t going to help, as only a single (or so) thread will be needed to send the query from the BI Server to the database, and it’s the database that’s doing all of the work – all that this would help with is compilation and post-aggregation work, and enabling the server to handle a higher number of concurrent users. Invest in a better underlying database instead, sort out your data model, and make sure your data source back-end is as optimised as possible.
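As a loose analogy for the pre-created thread pool described above, here is a generic Python worker-pool sketch. This is not OBIEE code, and the pool size of 40 is an arbitrary stand-in for the lower bound of a SERVER_THREAD_RANGE-style setting:

```python
# Generic analogy only: a pool of worker threads created up front at startup,
# sitting idle on a queue until query requests arrive (cf. the BI Server
# reserving threads in advance).
import queue
import threading

POOL_SIZE = 40                      # arbitrary stand-in for the lower bound of a thread range
work_queue: queue.Queue = queue.Queue()

def worker() -> None:
    while True:
        sql = work_queue.get()      # block until a request is dispatched
        if sql is None:             # sentinel: shut the worker down
            work_queue.task_done()
            break
        # In the analogy, the heavy lifting happens in the database, not here;
        # the thread mostly hands the SQL off and waits for results.
        print(f"{threading.current_thread().name} handling: {sql}")
        work_queue.task_done()

# Threads are created and reserved in advance, before any request arrives.
pool = [threading.Thread(target=worker, name=f"bi-thread-{i}", daemon=True)
        for i in range(POOL_SIZE)]
for t in pool:
    t.start()

# Dispatch a couple of requests; each occupies only one pool thread,
# however many CPUs the underlying database uses to execute it.
for q in ("SELECT ... FROM sales", "SELECT ... FROM customers"):
    work_queue.put(q)
work_queue.join()                   # wait until the dispatched requests are handled
```

The point of the analogy is the last comment in the sketch: an individual request occupies a single pool thread, so adding threads or CPUs on the BI Server side does little for one query whose real work happens in the database.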
cezarovidiu

The Past, Present and Future of Business Intelligence. - YouTube - 0 views

shared by cezarovidiu on 13 Jan 13
  • Irshad Raihan interviews Don Lutter, senior BI solutions manager, on HP's products, solutions and services for Business Intelligence. Don has over 30 years of experience building and implementing BI solutions. He talks about HP's view on where the market is headed and how HP can help customers address the challenges of Big Data and Real Time Analytics. This interview was recorded at HP Discover 2011 in Las Vegas.
cezarovidiu

Why Soft Skills Matter in Data Science - 0 views

  • You cannot accept problems as handed to you in the business environment. Never allow yourself to be the analyst to whom problems are “thrown over the fence.” Engage with the people whose challenges you’re tackling to make sure you’re solving the right problem. Learn the business’s processes and the data that’s generated and saved. Learn how folks are handling the problem now, and what metrics they use (or ignore) to gauge success.
  • Solve the correct, yet often misrepresented, problem. This is something no mathematical model will ever say to you. No mathematical model can ever say, “Hey, good job formulating this optimization model, but I think you should take a step back and change your business a little instead.” And that leads me to my next point: Learn how to communicate.
  • In today’s business environment, it is often unacceptable to be skilled at only one thing. Data scientists are expected to be polyglots who understand math, code, and the plain-speak (or sports analogy-ridden speak . . . ugh) of business. And the only way to get good at speaking to other folks, just like the only way to get good at math, is through practice.
  • Beware the Three-Headed Geek-Monster: Tools, Performance, and Mathematical Perfection. Many things can sabotage the use of analytics within the workplace. Politics and infighting perhaps; a bad experience from a previous “enterprise, business intelligence, cloud dashboard” project; or peers who don’t want their “dark art” optimized or automated for fear that their jobs will become redundant.
  • Not all hurdles are within your control as an analytics professional. But some are. There are three primary ways I see analytics folks sabotage their own work: overly complex modeling, tool obsession, and fixation on performance.
  • In other words, work with the rest of your organization to do better business, not to do data science for its own sake.
  • Data Smart: Using Data Science to Transform Information into Insight by John W. Foreman. Copyright © 2013.
cezarovidiu

http://www.oracle.com/ocom/groups/systemobject/@mktg_admin/documents/webcontent/videopl... - 0 views

  • Oracle CX Management Investment - OOW2013
cezarovidiu

Download Microsoft Power Query for Excel - Office.com - 0 views

  • Microsoft Power Query is an Excel add-in that enhances the self-service Business Intelligence experience in Excel by simplifying data discovery and access.
  • Power Query enables users to easily discover, combine, and refine data for better analysis in Excel. Power Query includes a public search feature that is currently intended for use in the United States only.
cezarovidiu

Focus on Valuable Data - Not Big Data - to Boost Conversions and ROI | ClickZ - 0 views

  • Big Data has been all the rage. But fast data, even if it is small, can be more valuable than complicated masses of information.
  • Here's why: All the focus on "bigger is better" has overlooked the fact that most Big Data segments have not been validated with a business application or value.
  • Those kinds of analytics can help you find the right streams to access and work with, and also can help you build out robust programs that identify valuable customers.
  • 1) Your First-Party Data: The primary and most valuable data set you can access, first-party data encompasses transactional and other customer-level profile information you have on your customers. It could also include your own off-line segmentation analysis that allows you to map a customer to a customer profile around which you build your marketing programs. This can also include your analytics or other on-site tracking data, which can deliver behavioral insight to your consumers. This data can be difficult to export from its current environment due to the ad hoc nature of the data, but, if possible, look at ways to make this information accessible to your digital sites.
  • 2) Third-Party Data: A consumer's broader Web browsing and buying history can now be accessed in session to provide you with more context on their likes and habits. Data management platforms (DMPs) and other data aggregators are accelerating this offering and, just as importantly, the availability of this type of data. This is invaluable in the context of new visitors who you know nothing about historically.
  • 3) Real-Time Behavior: Let's not forget what our customers are telling us with each click. We get enamored with our predictive modeling to the point that we do not see the tell-tale signs as they are happening. Take the time to stop, look, and react. Your analytic tools, personalization tools, and other software-as-a-service (SaaS) platforms can help you trigger alternate site experiences based on every click you see.
cezarovidiu

Big data: The next frontier for innovation, competition, and productivity | McKinsey & ... - 0 views

  • The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus, according to research by MGI and McKinsey's Business Technology Office.
  • For example, a retailer using big data to the full could increase its operating margin by more than 60 percent.
  • important factor of production, alongside labor and capital.
  • five broad ways in which using big data can create value
  • Leading companies are using data collection and analysis to conduct controlled experiments to make better management decisions
  • others are using data for everything from basic low-frequency forecasting to high-frequency nowcasting to adjust their business levers just in time.
  • big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.
  • Fourth, sophisticated analytics can substantially improve decision-making
  • big data can be used to improve the development of the next generation of products and services.
  • The use of big data will become a key basis of competition and growth for individual firms.
  • For example, we estimate that a retailer using big data to the full has the potential to increase its operating margin by more than 60 percent.
  • The computer and electronic products and information sectors, as well as finance and insurance, and government are poised to gain substantially from the use of big data.
cezarovidiu

Analyzing Human Data: Take a Dive to Find Out What Your Customers Really Feel - Content... - 0 views

  • What really interests me, and what I think should interest marketers, is what I’ll call signals – one of which is intent. Intent is critical because it can predict action. For example, “Is this person shopping to buy a product like my product?” “Is this person unhappy and needing some form of attention?” “Is this person about to return the product for a reason that is addressable?”
  • Sentiment is one ingredient of intent. If someone is happy, sad, angry … that can be determined via sentiment analysis technologies.
  • Many tools struggle with context.
  • An example I hear over and over again is “thin” – good when you’re talking about electronics, but bad if you’re talking about hotel walls or the feel of hotel sheets. To do sentiment analysis correctly, you need refinement. You need customization for particular industries and business functions.
  • The market, unfortunately, is polluted with tools that claim to have sentiment abilities, but are too crude to be usable. Even with refinement (e.g., the ability to handle negators and contextual sentiment), approaches that deliver only positive and negative ratings don’t take you very far. (A toy negation- and domain-aware scoring sketch follows this list.)
  • There are definitely easy, inexpensive entry points that can meet basic, just-getting-started needs: tools for social listening, survey analysis, customer service (handling contact-center notes, for instance), customer experience (via analysis of online reviews and forums), automated email processing, and other needs. These technologies are user friendly, available on demand, as a service.
  • Text mining: Digital Reasoning, Luminoso, and AlchemyAPI.
  • Image recognition and analysis: Image analysis now automatically identifies brand labels in pictures. Vendors include VisualGraph (now owned by Pinterest), Curalate, Piqora (née Pinfluencer), and gazeMetrix.
  • Emotional analysis in images, audio, and video: These companies promote analysis of speech and facial expression primarily for structured studies
  • Affectiva conducts webcam emotional analysis for media and ad research, including development tools to integrate emotional study in mobile apps.
  • Emotient performs emotional analyses in retail environments, evaluating signage, displays, and customer service.
  • EmoVu by Eyeris tests the engagement level of both short- and long-form video content.
  • Beyond Verbal studies emotion based on a person's voice in real time.
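To make the "thin" example and the negator point concrete, here is a deliberately toy sketch of lexicon-based scoring with per-domain word lists and a one-token negation flip. The words and weights are invented for illustration and have nothing to do with the vendors listed above:

```python
# Toy illustration only: domain-specific lexicons plus a one-token negation flip.
# Word lists and weights are invented; real sentiment engines are far richer.
DOMAIN_LEXICON = {
    "electronics": {"thin": 1, "fast": 1, "sleek": 1, "slow": -1, "broken": -1},
    "hotels":      {"thin": -1, "clean": 1, "quiet": 1, "noisy": -1, "comfortable": 1},
}
NEGATORS = {"not", "never", "no", "hardly"}

def score(text: str, domain: str) -> int:
    """Sum word polarities for the given domain, flipping a word preceded by a negator."""
    lexicon = DOMAIN_LEXICON[domain]
    tokens = [t.strip(".,!?") for t in text.lower().split()]
    total = 0
    for i, tok in enumerate(tokens):
        if tok in lexicon:
            polarity = lexicon[tok]
            if i > 0 and tokens[i - 1] in NEGATORS:
                polarity = -polarity   # "not comfortable" scores negatively
            total += polarity
    return total

# The same word, opposite meaning depending on domain:
print(score("Impressively thin and fast", "electronics"))                      # positive
print(score("The walls are thin and the room was not comfortable", "hotels"))  # negative
```

Crude positive/negative totals like this are exactly what the passage warns about; the useful part is the domain customization, and in practice sentiment would be paired with intent and other signals rather than stopping at a score.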