• 00:36

    Hadoop: How to Pick the Right Use Case

    in Technology

     


    Big data and Hadoop have received a lot of press lately. For all the hype, Hadoop does look like an attractive, low-cost solution that your company could leverage. Yet aside from limited knowledge of how the technology works, a primary reason organizations struggle to achieve value from big data is the lack of a compelling business use case.


    The team at MetaScale has given this common roadblock plenty of consideration, first while making big data decisions within Sears Holdings and now as a "big data accelerator" helping clients find the right answers. Tune in for this A2 Radio broadcast as Ankur Gupta, a big data director at Sears and head of sales, marketing, and operations for MetaScale, shares tips on:


     



    Making a case for Hadoop in your organization
    Enabling business analytics by speeding up data workloads
    Avoiding common missteps and roadblocks
    Developing a clearly defined, long-term big data strategy

  • 00:25

    If everything is about the data, doesn't it make sense to protect it?

    in Technology

    Learn about the increased focus placed on data security, compliance, and privacy to deal with the constant threats facing enterprises, and how a leading industry player effectively makes data protection possible in its growing, diverse IT ecosystem. As companies' environments become more open and complex, sensitive data can be found not only in internal databases, but also in data warehouses, big data (Hadoop or NoSQL) platforms, and file systems, including those that are outsourced or in the cloud. We will discuss best-practice approaches (such as data activity monitoring and SIEM) that make data protection and regulatory compliance a reality for the organization while keeping costs down in an ever-growing and changing IT deployment.
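
    As a concrete illustration of the data activity monitoring pattern mentioned above, the sketch below formats a hypothetical database audit event as a CEF message and forwards it over UDP syslog for SIEM ingestion. The host name, port, vendor/product strings, and event fields are all illustrative assumptions, not any specific product's API.

        // Minimal data-activity-monitoring sketch: forward a (hypothetical)
        // database audit event to a SIEM as a CEF-formatted syslog message.
        // Host, port, and all event fields below are illustrative assumptions.
        import java.net.DatagramPacket;
        import java.net.DatagramSocket;
        import java.net.InetAddress;
        import java.nio.charset.StandardCharsets;

        public class AuditEventForwarder {
          public static void main(String[] args) throws Exception {
            // Hypothetical audit event: a bulk read against a sensitive table.
            String cef = "CEF:0|ExampleCo|DbMonitor|1.0|100|"
                + "Bulk read on sensitive table|7|"
                + "suser=etl_batch dst=10.0.0.12 "
                + "cs1Label=table cs1=customers.pii cnt=250000";
            // <134> = syslog priority: facility local0, severity informational.
            byte[] payload = ("<134>" + cef).getBytes(StandardCharsets.UTF_8);

            // Send the event to the SIEM's syslog listener (UDP port 514).
            try (DatagramSocket socket = new DatagramSocket()) {
              InetAddress siem = InetAddress.getByName("siem.example.com");
              socket.send(new DatagramPacket(payload, payload.length, siem, 514));
            }
          }
        }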

  • 00:35

    Big Data and Open Source

    in Software

    The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, thereby delivering a highly available service on top of a cluster of computers, each of which may be prone to failure. Cloudera's open-source Apache Hadoop distribution, CDH, targets enterprise-class deployments of the technology.
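
    To make the "simple programming models" concrete, below is a minimal sketch of the classic MapReduce word count, written against the standard Hadoop MapReduce Java API. Input and output paths come from the command line; the same code can run on a single machine or be distributed across a cluster.

        // Classic Hadoop MapReduce word count: the map phase emits (word, 1)
        // pairs, and the reduce phase sums the counts for each word.
        import java.io.IOException;
        import java.util.StringTokenizer;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.io.IntWritable;
        import org.apache.hadoop.io.Text;
        import org.apache.hadoop.mapreduce.Job;
        import org.apache.hadoop.mapreduce.Mapper;
        import org.apache.hadoop.mapreduce.Reducer;
        import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
        import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

        public class WordCount {

          // Map phase: emit (word, 1) for every token in the input split.
          public static class TokenizerMapper
              extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable one = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
              StringTokenizer itr = new StringTokenizer(value.toString());
              while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
              }
            }
          }

          // Reduce phase: sum the counts emitted for each word.
          public static class IntSumReducer
              extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
              int sum = 0;
              for (IntWritable val : values) {
                sum += val.get();
              }
              result.set(sum);
              context.write(key, result);
            }
          }

          public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
          }
        }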

  • 00:44

    7DEE / Cloud Analytics: Building Your Infrastructure for Big Data

    in Technology

    Learn how to lay down the technology foundation for your analytics transformation. In this session, we'll walk you through the relative advantages and disadvantages for midsized firms of public cloud, private cloud, HPC clusters, scaling up vs. scaling out, placing data close to compute resources, and more. Find out how midsized firms can get the most from Hadoop, flash storage, and other tools. We'll introduce you to the infrastructure requirements of big data and give you a solid idea of the infrastructure choices midmarket firms have for different types of big data problems.
