Tech News - Data

1209 Articles

How to analyze Salesforce Service Cloud data smarter with Tableau Dashboard Starters from What's New

Matthew Emerick
13 Oct 2020
5 min read
Boris Busov, Solution Engineer; Maddie Rawding, Solution Engineer; Tanna Solberg. October 13, 2020.

The key to building a customer-focused organization is effective customer service. With every touchpoint, there are opportunities to increase operational efficiency and productivity, improve customer satisfaction, and build customer loyalty. High-performing service teams are 1.6 times more likely to use analytics to improve service. However, there are many pain points to getting started: there is a wealth of data coming from a variety of tools, traditional governance models prevent users from accessing data, and on top of everything it can be hard to find insights in complex data. The result is that customer service teams lack direction on how to improve and make their customers happy.

Every department in an organization should be able to understand its data, and customer service organizations are no exception. That's why we're excited to add the Service Overview and Case Tracking dashboards to our collection of starters. These two Dashboard Starters are built specifically for Salesforce Service Cloud and are a great launching pad for anyone introducing analytics to their service organization. Salesforce puts customer experience at the center of every conversation, and now you can use the power of Tableau's new Dashboard Starters to discover insights and make data-driven decisions in your service organization.

Getting started with Service Cloud Dashboard Starters

All of our Dashboard Starters are available on Tableau Online: simply create a new workbook and connect to Dashboard Starters while building it in Tableau Online (to learn how, follow the steps in this Help article). For Service Cloud, select and open the Service Overview and Case Tracking starters. If you don't have Tableau Online, you can start a free trial. Alternatively, you can download the Dashboard Starters from our website. We have a whole collection of Salesforce Dashboard Starters available for you to try.

Service Overview Dashboard Starter

Use the Service Overview dashboard to get a high-level rundown of your business across important metrics like CSAT, number of cases, response time, and SLA compliance. Select a metric at the top to filter all of the views on the dashboard, and then drill into cases by selecting individual marks on a view.

Figure 1: Monitor and drill into key performance metrics with the Service Overview dashboard.

With the Service Overview dashboard you can come to a consensus on what good customer service looks like in your organization. Each metric has a customizable target on the dashboard that can be used to set benchmarks for your organization, and alerts can be set on Tableau Online to notify users. Filter to see information for different time periods, geographies, and more.

Figure 2: Set target goals to deliver great service.

Case Tracking Dashboard Starter

The Case Tracking dashboard allows agents to monitor their case queue and performance over time. Filter the dashboard to an individual agent and then drill into trends over time to discover potential opportunities for improvement.

Figure 3: Explore performance by agent and monitor trends with the Case Tracking dashboard.

The Case Tracking dashboard also allows you to drill into case details. Add your Salesforce URL (make sure the parameter is entered correctly) and return to the dashboard. Use the arrow on the case details worksheet to jump directly into the case in Salesforce.

Figure 4: Drill into case details and then head to Salesforce to take action.

Sharing and customizing the Dashboard Starters

These Service Cloud starters are meant to be a starting point; the possibilities are limitless. For example, you can:
- Publish your starters and then set alerts and subscriptions to share with your teams (see the publishing sketch at the end of this post).
- Add data and create visualizations from other important source systems to enrich your analysis.
- Create new KPIs, build custom calculations, and modify the starters to match how your organization provides service.
- Use custom colors to match your organization's branding.

Plugging your own data into the Dashboard Starter

These starters use sample data. If you want to add your own data, you will need to connect to your Salesforce instance:
1. Select the Data Source tab. A dialog box will appear prompting you for your application credentials (i.e., your Salesforce username and password).
2. Enter your credentials and log in to your account. Check with your Salesforce admin to make sure your account has API access to your Salesforce instance.
3. Go back to the dashboard. Tableau Desktop will then create an extract of your data; how long this takes will vary based on how much data is in your Salesforce instance.
4. If any worksheets appear blank, navigate to the blank worksheet and replace the referenced fields as necessary by right-clicking the fields marked with red exclamation points.

Making the most of the new Service Overview and Case Tracking Dashboard Starters lets you organize and analyze the wealth of data gained from every customer interaction. Being able to elevate insights empowers service teams to take action, resulting in lower call volumes, faster resolution times, and improved workflows. From the release of the Tableau Viz Lightning Web Component to the enhancements in Tableau's connector to Salesforce, there has never been a better time to start analyzing your data in Tableau, and these Dashboard Starters are just the beginning of what is to come with Tableau and Salesforce.

Additional resources:
- Connect to Salesforce data in Tableau
- Documentation on Dashboard Starters for cloud-based data
- Overview of Tableau Dashboard Starters
- Tableau Viz Lightning Web Component
- What is Salesforce Service Cloud?
- Tableau resources for customer service teams
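As a hedged sketch of the "publish your starters and set alerts and subscriptions" step above, a customized starter workbook could be pushed to Tableau Online programmatically with the tableauserverclient Python library; the site URL, token, project name, and file name below are placeholders rather than values from the post.

```python
# Hypothetical sketch: publish a customized Dashboard Starter to Tableau Online
# with the tableauserverclient library. All names, tokens, and URLs are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth(
    token_name="starter-publisher",           # hypothetical personal access token name
    personal_access_token="<token-secret>",   # issued from your Tableau Online account
    site_id="my-site",                        # your Tableau Online site name
)
server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Look up the target project by name (assumed to already exist).
    projects, _ = server.projects.get()
    project = next(p for p in projects if p.name == "Service Analytics")

    # Publish the customized starter; Overwrite replaces any previously published copy.
    workbook = TSC.WorkbookItem(project_id=project.id, name="Service Overview - Customized")
    server.workbooks.publish(
        workbook,
        "service_overview_customized.twbx",
        mode=TSC.Server.PublishMode.Overwrite,
    )
```

Once the workbook is published, alerts and subscriptions can be configured on it in Tableau Online as the post describes.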


Happy Birthday, CDP Public Cloud from Cloudera Blog

Matthew Emerick
13 Oct 2020
4 min read
On September 24, 2019, Cloudera launched CDP Public Cloud (CDP-PC) as the first step in delivering the industry's first Enterprise Data Cloud.

That Was Then

In the beginning, CDP ran only on AWS with a set of services that supported a handful of use cases and workload types:
- CDP Data Warehouse: a Kubernetes-based service that allows business analysts to deploy data warehouses with secure, self-service access to enterprise data.
- CDP Machine Learning: a Kubernetes-based service that allows data scientists to deploy collaborative workspaces with secure, self-service access to enterprise data.
- CDP Data Hub: a VM/instance-based service that allows IT and developers to build custom business applications for a diverse set of use cases with secure, self-service access to enterprise data.

At the heart of CDP is SDX, a unified context layer for governance and security that makes it easy to create a secure data lake and run workloads that address all stages of your data lifecycle (collect, enrich, report, serve, and predict).

This Is Now

With CDP-PC just a bit over a year old, we thought now would be a good time to reflect on how far we have come since then. Over the past year, we've not only added Azure as a supported cloud platform, but we have improved the original services while growing the CDP-PC family significantly.

Improved Services
- Data Warehouse: in addition to a number of performance optimizations, DW has added new features for better scalability, monitoring, and reliability to enable self-service access with security and performance.
- Machine Learning: has grown from a collaborative workbench to an end-to-end production ML platform that enables data scientists to deploy a model or an application to production in minutes, with production-level monitoring, governance, and performance tracking.
- Data Hub: has expanded to support all stages of the data lifecycle:
  - Collect: Flow Management (Apache NiFi), Streams Management (Apache Kafka), and Streaming Analytics (Apache Flink)
  - Enrich: Data Engineering (Apache Spark and Apache Hive)
  - Report: Data Engineering (Hive 3), Data Mart (Apache Impala), and Real-Time Data Mart (Apache Impala with Apache Kudu)
  - Serve: Operational Database (Apache HBase), Data Exploration (Apache Solr)
  - Predict: Data Engineering (Apache Spark)

New Services
- CDP Data Engineering (1): a service purpose-built for data engineers, focused on deploying and orchestrating data transformation using Spark at scale (a minimal Spark sketch appears at the end of this post). Behind the scenes, CDE leverages Kubernetes to provide isolation and autoscaling, as well as a comprehensive toolset to streamline ETL processes, including orchestration automation, pipeline monitoring, and visual troubleshooting.
- CDP Operational Database (2): an autonomous, multimodal, autoscaling database environment supporting both NoSQL and SQL. Under the covers, Operational Database leverages HBase and allows end users to create databases without having to worry about infrastructure requirements.
- Data Visualization (3): an insight and visualization tool, pre-integrated with Data Warehouse and Machine Learning, that simplifies sharing analytics and information among data teams.
- Replication Manager: makes it easy to copy or migrate unstructured (HDFS) or structured (Hive) data from on-premises clusters to CDP environments running in the public cloud.
- Workload Manager: provides in-depth insights into workloads that can be used for troubleshooting failed jobs and optimizing slow workloads.
- Data Catalog: enables data stewards to organize and curate data assets globally, understand where relevant data is located, and audit how it is created, modified, secured, and protected.

Each of the above is integrated with SDX, ensuring a consistent mechanism for authentication, authorization, governance, and management of data, regardless of where you access your data from and how you consume it. Behind these new features are countless issues resolved, tweaks made, and improvements added by hundreds of people to improve the performance, scalability, reliability, usability, and security of CDP Public Cloud.

And We Are Not Done

And that was just the first 12 months. Our roadmap includes a number of exciting new features and enhancements to build on our vision of helping you:
- Do Cloud Better: deliver cloud-native analytics to the business in a secure, cost-efficient, and scalable manner.
- Enable Cloud Everywhere: accelerate adoption of cloud-native data services for public clouds.
- Optimize the Data Lifecycle: collect, enrich, report, serve, and model enterprise data for any business use case in any cloud.

Learn More, Keep in Touch

We invite you to learn more about CDP Public Cloud for yourself by watching a product demo or by taking the platform for a test drive (it's free to get started). Keep up with what's new in CDP-PC by following our monthly release summaries.

(1) Currently available on AWS only
(2) Technical Preview on AWS and Azure
(3) Data Visualization is in Tech Preview on AWS and Azure

The post Happy Birthday, CDP Public Cloud appeared first on Cloudera Blog.
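The post stays at the service level, but the CDP Data Engineering description above amounts to running Spark transformation jobs at scale. As a loose, hedged illustration (paths, columns, and the aggregation are invented, and nothing here uses CDE-specific APIs), such a job might look like this in PySpark:

```python
# Minimal PySpark sketch of a batch transformation job of the kind a data
# engineering service would schedule; all paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-revenue").getOrCreate()

# Read raw order events from cloud object storage (placeholder path).
raw = spark.read.json("s3a://example-bucket/raw/orders/")

# Transform: derive an order date and aggregate revenue per region and day.
daily_revenue = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("revenue"))
)

# Write a partitioned, query-friendly output for downstream reporting.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-bucket/curated/daily_revenue/"))

spark.stop()
```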


On-premises data gateway October 2020 update is now available from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
13 Oct 2020
1 min read
October release of gateway


Interactive, notebook-style analysis in Tableau for data science extensibility from What's New

Matthew Emerick
12 Oct 2020
4 min read
Tamas Foldi, CTO of Starschema Inc., Tableau Zen Master, and White Hat Hacker; Tanna Solberg. October 12, 2020.

Tableau's intuitive drag-and-drop interface is one of the key reasons it has become the de facto standard for data visualization. With its easy-to-use interface, not only analysts but everyone can see and understand their data. But let's look at who we mean when we say "everyone." Does this include sophisticated users like data scientists or statisticians? In short, yes, but their workflow is slightly different: they rely heavily on statistical and machine learning algorithms, usually only accessible from R, Python, or Matlab. To interact with these libraries, statisticians and data scientists have to write code, experiment with their model parameters, and visualize the results. The usual tool of choice for data scientists is a notebook environment, such as RStudio or Jupyter, where they can mix their code and the visualizations.

Figure 1: The traditional Jupyter notebook environment

In the past, the number one reason for the lower adoption of Tableau among data scientists was the lack of support for this code-driven, iterative development methodology. However, with the Dashboard Extensions API and the Analytics Extensions API, things have changed. The platform for everyone offers the best of both the code-driven data science and the easy-to-use, drag-and-drop visualization worlds.

Tableau Python and R Integration

The Analytics Extension presents the standard way to use Python, R, Matlab, and other platforms' libraries and functions in Tableau workbooks. With the standard SCRIPT Tableau functions, users can add their Python or R code as Tableau calculated fields, opening up a whole new world in data enrichment and analysis (a minimal sketch of such a field appears below).

Figure 2: A simple example of a Python calculated field in Tableau Desktop

While it's a convenient way to use existing calculations, this is not the same iterative experience as a notebook. Here the Dashboard Extensions API comes to the rescue, providing the user experience to work with the code in a code editor while seeing the results immediately as Tableau charts.

CodePad editor for Tableau

The Tableau Extension Gallery was recently updated with a new extension that allows interaction with your code, just like you would have in a notebook. As you change the code in the code editor, Tableau executes it, recalculates the marks, and updates the visualization before your very eyes.

Figure 3: Updating your viz with the CodePad code editor extension

To use the extension, you need to create a string parameter and a SCRIPT-based calculated field with the relevant fields mapped as script parameters.

Figure 4: Create a parameter to store the program code
Figure 5: Use the parameter in a SCRIPT function

Then add the extension to your dashboard, select the previously created parameter, and choose the same programming language as the one configured in your Analytics Extension:

Figure 6: Add and configure the CodePad extension on your dashboard

Now you can start building your views, adding machine learning models, and using external APIs to enrich the data, all from the same platform. The best part is that you can reuse the same workbook to share the analysis with end users, potentially on a different worksheet.
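As a hedged, simplified illustration of such a Python calculated field (the fields and the profit-ratio logic are invented for this example, not taken from the post), the pattern with TabPy configured as the Analytics Extension looks roughly like this:

```python
# Hypothetical Python calculated field for Tableau's Analytics Extension (TabPy).
# In Tableau Desktop the calculation would look roughly like:
#
#   SCRIPT_REAL("
#       return [p / s if s else None for s, p in zip(_arg1, _arg2)]
#   ", SUM([Sales]), SUM([Profit]))
#
# TabPy passes each aggregated field in as _arg1, _arg2, ... and expects one value
# back per mark in the view. The same logic as a plain, locally runnable function:
def profit_ratio(_arg1, _arg2):
    """_arg1 ~ SUM(Sales) per mark, _arg2 ~ SUM(Profit) per mark (assumed mapping)."""
    return [p / s if s else None for s, p in zip(_arg1, _arg2)]

# Quick local check with made-up values:
print(profit_ratio([100.0, 250.0, 0.0], [20.0, 50.0, 5.0]))  # [0.2, 0.2, None]
```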
Sample workbook analysis in Tableau

To show some real-life use cases, we put together an example workbook with three Python-based algorithms:
- Clustering: the clustering dashboard uses scikit-learn's DBSCAN algorithm to apply clustering to a set of points (a minimal sketch follows below).
- Seasonality Analysis: uses statsmodels' seasonal_decompose to remove seasonality from time series data and show the pure trends.
- Sentiment Analysis: compares the titles and ratings of product reviews with their sentiment scores.

Figure 7: Clustering using the DBSCAN algorithm
Figure 8: Sentiment analysis using nltk or textblob

Excited to try out this interactive, notebook-style analysis in Tableau? Download the demo workbook and add the extension to the dashboard. If you want to learn more about the Dashboard Extensions and Analytics Extensions APIs, you can join the Tableau Developer Program for additional resources and community interaction.
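As a hedged sketch of the clustering use case above (the points and the eps/min_samples parameters are made up, and inside Tableau this logic would sit in a SCRIPT-based field sent to TabPy), scikit-learn's DBSCAN can be applied to a set of points like this:

```python
# Standalone sketch of DBSCAN clustering as described for the demo workbook;
# the data and parameter values are hypothetical.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_points(xs, ys, eps=0.5, min_samples=2):
    """Return one integer cluster label per point (-1 marks noise)."""
    points = np.column_stack([xs, ys])
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points).tolist()

# Two tight groups of points plus one outlier (made-up data).
xs = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2, 9.9]
ys = [0.0, 0.1, 0.0, 5.0, 5.1, 5.0, 9.9]
print(cluster_points(xs, ys))  # e.g. [0, 0, 0, 1, 1, 1, -1]
```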


Data Source management on Power platform admin center from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
12 Oct 2020
1 min read
Data source management in admin center


Cloudera acquires Eventador to accelerate Stream Processing in Public & Hybrid Clouds from Cloudera Blog

Matthew Emerick
12 Oct 2020
3 min read
We are thrilled to announce that Cloudera has acquired Eventador, a provider of cloud-native services for enterprise-grade stream processing. Eventador, based in Austin, TX, was founded by Erik Beebe and Kenny Gorman in 2016 to address a fundamental business problem: make it simpler to build streaming applications on real-time data. This typically involved a lot of coding with Java, Scala, or similar technologies. Eventador simplifies the process by allowing users to use SQL to query streams of real-time data without implementing complex code.

We believe Eventador will accelerate innovation in our Cloudera DataFlow streaming platform and deliver more business value to our customers in their real-time analytics applications. The DataFlow platform has established a leading position in the data streaming market by unlocking the combined value and synergies of Apache NiFi, Apache Kafka, and Apache Flink. We recently delivered all three of these streaming capabilities as cloud services through Cloudera Data Platform (CDP) Data Hub on AWS and Azure. We are especially proud to help grow Flink, the software, as well as the Flink community.

The next evolution of our data streaming platform is to deliver a seamless cloud-native DataFlow experience where users can focus on creating simple data pipelines that ingest data from any streaming source, scale data management with topics, and generate real-time insights by processing the data on the pipeline with an easy-to-use interface. Our primary design principles are self-service, simplicity, and hybrid. And, like all CDP data management and analytic cloud services, DataFlow will offer a consistent user experience on public and private clouds, for real hybrid-cloud data streaming.

The Eventador technology's ability to simplify access to real-time data with SQL, and the team's expertise in managed service offerings, will accelerate our DataFlow experience timelines and make DataFlow a richer streaming data platform that can address a broader range of business use cases. With the addition of Eventador we can deliver more customer value for real-time analytics use cases, including:
- Inventory optimization, predictive maintenance, and a wide variety of IoT use cases for operations teams.
- Personalized promotions and customer 360 use cases for sales and marketing teams.
- Risk management and real-time fraud analysis for IT and finance teams.

To summarize, the addition of the Eventador technology and team to Cloudera will enable our customers to democratize cross-organizational access to real-time data. We encourage you to come with us on this journey as we continue to innovate the data streaming capabilities within the Cloudera Data Platform as part of the DataFlow experience. We are excited about what the future holds, and we warmly welcome the Eventador team into Cloudera. Stay tuned for more product updates coming soon!

The post Cloudera acquires Eventador to accelerate Stream Processing in Public & Hybrid Clouds appeared first on Cloudera Blog.
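The announcement itself contains no code, but as a hedged sketch of the "SQL on streams" idea it describes, here is roughly what a continuous query over a Kafka topic looks like with Apache Flink's Python Table API (PyFlink). The topic, fields, and broker address are hypothetical, this is plain Flink SQL rather than Eventador's managed offering, and running it assumes the Flink Kafka SQL connector is available on the classpath.

```python
# Hypothetical sketch: streaming SQL over a Kafka topic with PyFlink.
# Topic name, schema, and broker address are placeholders.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Declare a Kafka-backed table over a stream of payment events.
t_env.execute_sql("""
    CREATE TABLE payments (
        account_id STRING,
        amount     DOUBLE,
        ts         TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'payments',
        'properties.bootstrap.servers' = 'broker:9092',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# Continuous aggregation expressed purely in SQL: per-account totals over
# one-minute tumbling windows, with no Java or Scala stream-processing code.
result = t_env.execute_sql("""
    SELECT account_id,
           TUMBLE_START(ts, INTERVAL '1' MINUTE) AS window_start,
           SUM(amount) AS total_amount
    FROM payments
    GROUP BY account_id, TUMBLE(ts, INTERVAL '1' MINUTE)
""")
result.print()
```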

7 New Ways Cloudera Is Investing in Our Culture from Cloudera Blog

Matthew Emerick
08 Oct 2020
5 min read
As Cloudera offices around the world continue to cope with the impact of COVID-19, we have worked hard to ease stress and adapt to remote working. People are the heart of our company and we're investing in creative, new ways to make every Clouderan feel valued and appreciated. Clouderans are superstars at work and at home, and burn-out is unhealthy for employees, their families, and the company. Our plan is to adapt the amazing workplace culture we have at Cloudera to our new remote workstyle. Here are some of our recent initiatives geared toward supporting employees and reducing burn-out:

We're Pledging to Be Good Colleagues

Toward the start of our work-from-home tenure, we developed a Cloudera WFH Code. It was designed to help us all rally around a common set of guidelines and rules that would help set the tone for WFH moving forward. We're family first and people first.

We're Unplugging

Starting at the beginning of July, we designated certain days as "Unplug days," when employees are given the day (or in some cases multiple days) off to step away from work and do something to make their lives easier. That might mean pursuing a hobby, volunteering, spending time with family, or simply lying in bed and watching movies all day. With studies showing employees are working longer hours during this pandemic period, we need to make sure that Clouderans know that not only is it okay to unplug, we want them to. To date, Clouderans took off 10 Unplug days between July and September, with 22 more scheduled between now and Spring 2021.

We're making the most of our time. One of the more inspiring stories I heard was from one of our Singapore employees. Her office is participating in Mercy Relief's Ground Zero virtual run challenge to raise money for local communities affected by natural disasters. Over our last Unplug weekend, the team covered 55 km collectively and garnered $3,000 to donate to the cause.

We're Taking Time off to Vote

Cloudera pledged to #MakeTimeToVote, as part of the Time to Vote initiative. We are actively encouraging all employees around the world to take the time off needed to become informed voters and participate in their community elections.

We're Investing More in Diversity & Inclusion (D&I) Efforts

Given the world climate around racism and injustice, and the spotlight on inclusion shortcomings in tech, we've doubled down on our commitment to D&I initiatives. Our new Chief Diversity Officer, Sarah Shin, is making speedy progress implementing these initiatives and getting them out to Clouderans. (And if you're interested in fostering D&I within the workplace, we happen to have six new and open roles on her team.) One of Sarah's first initiatives was implementing Bias Busters workshops for 379 managers. These sessions shared tools and best practices our managers can use to identify and interrupt unconscious biases. We also had the pleasure of meeting with Dr. Mary Frances Berry, acclaimed activist, writer, lawyer, and professor, to discuss diversity in tech and Cloudera's role in leading the way toward a more inclusive workforce. Our Equality Committee had direct one-on-one time with Dr. Berry, while our CEO, Rob Bearden, had a compelling discussion with her in our recent Cloudera Now virtual event as well as at a company Town Hall.

We're Reinvigorating Our Creativity

Whether it's through the free virtual Medicine for the Soul Yoga memberships we're providing, our new virtual cooking class program, or the meeting-free days we offer each week, we're committed to helping Clouderans spark creativity. We each de-stress, find motivation, and thrive in different environments. The ability to choose is what's most important. Plus, many of us will pick up a new hobby in the process.

We're Communicating

This year, we launched a new weekly e-newsletter, Thriving Together, to keep our employees connected to the company and each other. Each issue features a Q&A with a Clouderan, highlights virtual events, links to employee (and world) news, shares work-from-home tips and articles, and offers some levity for the workweek. My favorite section? Cool Things Clouderans Are Up To. I love learning about the creative ways our employees are connecting with each other while apart.

We also launched a monthly manager newsletter. It keeps our leaders up-to-date on company initiatives and shares ways to support their team, and themselves, while we all learn how to navigate this 100% remote work world. Plus, we're extraordinarily active on Slack. With channels for everything from dad jokes to pets to solo quarantining, we have something for everyone, and we're seeing a high level of engagement across the board.

We're Volunteering

While we traditionally have one Global Day of Service each year, when employees have the day to volunteer and give back to their communities, this year we had three. The 2020 theme was Embracing Different Perspectives, and we provided Clouderans with multiple online opportunities to learn, volunteer, and give back. We were also able to participate in important conversations with leading nonprofits tackling the thorniest local and global issues.

As we keep our fingers on the pulse of our company culture, we continue to roll out new initiatives to help meet our employees' needs. We'll continue to invest in emotional, mental, and physical well-being and keep our workplace culture at the forefront as we move forward, together. To learn more about our commitment to diversity and inclusion, take a look at our CEO Rob Bearden's blog post on the topic and stay tuned to hear from our new CDO Sarah Shin later this month.

The post 7 New Ways Cloudera Is Investing in Our Culture appeared first on Cloudera Blog.


Announcing improved MDX query performance in Power BI from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
07 Oct 2020
1 min read
With a recent update of the Analysis Services Tabular engine in Power BI, Multidimensional Expressions (MDX) clients, such as Microsoft Excel, can now enjoy improved query performance against datasets in Power BI.


Join us! Power BI Webinar. Wednesday 30 September 2020 – 8:00 AM – 9:00 AM PDT from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
28 Sep 2020
1 min read
This webinar shows shortcuts that help you unlock a new superpower in your usual workflow and save a lot of time when working with Power BI together with Excel.


Data & insights for tracking the world’s most watched election from What's New

Anonymous
24 Sep 2020
4 min read
Data is how we understand elections. In the leadup to major political events, we all become data people: we track the latest candidate polling numbers, evaluate policy proposals, and look for statistics that could explain how the electorate might swing this year. This has never been more true than during this year's U.S. presidential election. Quality data and clear, compelling visualizations are critical to understanding the polls and how sentiment among voters could influence the outcome of the race.

Today, Tableau is announcing a new partnership with SurveyMonkey and Axios to bring exclusive public opinion research to life through rich visual analytics. Powered by SurveyMonkey's vast polling infrastructure, Tableau's world-class data visualization tools, and Axios' incisive storytelling, this resource will enable anyone to delve into of-the-moment data and make discoveries. In the leadup to the election, SurveyMonkey will poll a randomly selected subset of the more than 2 million people who take a survey on their platform every day, asking a wide range of questions, from election integrity to COVID-19 concerns to how people will vote.

"It's never been clearer that what people think matters," says Jon Cohen, chief research officer at SurveyMonkey. "With people around the world tuned into the U.S. presidential election, we're showcasing how American voters are processing their choices, dealing with ongoing devastation from the pandemic, managing environmental crises, and confronting fresh challenges in their daily lives."

The results from SurveyMonkey's ongoing polls will be published in interactive Tableau dashboards, where anyone will be able to filter and drill down into the data to explore how everything from demographics to geography to political affiliation plays into people's opinions. "To understand public views, we need to go beyond the topline numbers that dominate the conversation," Cohen adds. "The critical debate over race and racial disparities and deeply partisan reaction to the country's coronavirus response both point to the need to understand how different groups perceive what's happening and what to do about it."

As a platform purpose-built for helping people peel back the layers of complex datasets and gain insights, Tableau provides visitors a compelling avenue to better understanding this year's pre-election landscape. "People need reliable, well-designed data visualizations that are easy to understand and can provide key insights," says Andy Cotgreave, Tableau's Director of Technical Evangelism. "As Americans make their decisions ahead of the election, they need charts that are optimized for communication. Tableau makes it possible to quickly and easily build the right chart for any data, and enable people to understand the data for themselves."

Alongside the dashboards, Tableau and SurveyMonkey experts will contribute pointers on visualization best practices for elections, and resources to enable anyone to better work with survey data. And Axios, as the exclusive media partner for this project, will incorporate the data and visualizations into its ongoing analysis of the political landscape around the election.

At Tableau, we believe that data is a critical tool for understanding complex issues like political elections. We also believe that where data comes from, and how it's understood in context, is essential. Through our partnership with SurveyMonkey and Axios, we aim to provide visitors with an end-to-end experience of polling data: from understanding how SurveyMonkey's polling infrastructure produces robust datasets, to seeing them visualized in Tableau, to watching them inform political commentary through Axios. Data doesn't just answer questions; it prompts exploration and discussion. It helps us understand the complex issues shaping our jobs, families, and communities. Helping people see and understand survey data can bring clarity to important issues leading up to the election and lets people dig deeper to answer their own specific questions.

See all the Power BI updates at the Microsoft Business Applications Launch Event from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
24 Sep 2020
1 min read
We’re excited to share all the new innovations we’re rolling out for Power BI to help make creating professional-grade apps even easier. Join us on October 1, 2020, from 9–11 AM Pacific Time (UTC -7), for this free digital event.


On-premises data gateway September 2020 update is now available from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
24 Sep 2020
1 min read
September 2020 gateway release


Announcing the Upcoming Evolution of Power BI Premium to enterprise markets and beyond from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
23 Sep 2020
1 min read
Yesterday was filled with announcements of new capabilities of Power BI Premium and even a per user licensing option to gain access to Premium features. This blog post sums those up so you can prepare for a much better experience of owning and using Power BI Premium.

Answering your questions around the new Power BI Premium per user license from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
23 Sep 2020
1 min read
Today’s announcement by Arun around the introduction of a Premium per user license option has generated a lot of interest and excitement in the Power BI community. It has also generated a lot of questions, so we’ve put together this blog post to answer some of the most common ones we’ve seen.


An inside look: The World Food Programme’s data-driven response to hunger during COVID-19 from What's New

Matthew Emerick
23 Sep 2020
7 min read
Neal Myrick, Global Head of the Tableau Foundation; Hannah Kuffner. September 23, 2020.

As COVID-19 has progressed, the need for organizations to be able to quickly access and analyze data has only increased. Tableau Foundation has worked with many of our partner organizations as they've navigated new challenges, from increased needs to constrained supply chains and limited resource-delivery options. The World Food Programme (WFP), one of our longtime partners, has made considerable investments over the past few years in data infrastructure. During COVID-19, those investments have proven critical in enabling them to respond to this global crisis. And during the pandemic, we made a new $1.6 million contribution to WFP so they could expand their data capacity and serve not just their own programs, but the entire humanitarian sector.

WFP is the world's largest hunger-focused humanitarian organization, and they knew immediately that the coronavirus would impact their work. "Our most recent analysis shows that a quarter of a billion people could very likely face severe hunger in 2020, due to job losses and loss of remittances in populations that are already vulnerable," says Enrica Porcari, WFP's Chief Information Officer. Many of these communities depend on humanitarian support. With demand increasing so dramatically, WFP recognized there was a risk that existing programs might not be able to meet it. WFP, though, has found innovative solutions to step up its support for the entire humanitarian community to ensure communities have the support they need, even as the crisis evolves.

A global response to a global pandemic

What has enabled WFP to keep two steps ahead of the pandemic is data. WFP operates in around 83 countries, and visibility into the conditions on the ground in each community they serve is essential. "When COVID-19 started, it was very difficult for us to have a sense of how big it was going to get," says Pierre Guillaume Wielezynski, Digital Transformation Services Chief in WFP's Technology department. But WFP's investments in data and analytics infrastructure in recent years have set the organization up to remain responsive in times of uncertainty, when quick insights and decision-making are essential. "We've worked very closely with our colleagues across the organization to develop proven methods for tracking the impact of external events—from natural disasters to conflicts—on food security," Wielezynski says. WFP was able to deploy that same data-driven methodology to rapidly assess the impacts of COVID-19.

Rainfall and markets data overlaid for Afghanistan in VAM's DataViz

A part of this data backbone is the Vulnerability Analysis and Monitoring (VAM) unit, made up of a network of analysts around the world that collect up-to-date data on a variety of metrics that affect food security in a community. VAM oversees data spanning rainfall and vegetation conditions, conflict, hazards, and even the world's largest public market price database, to name a few. With over 100 active dashboards accessible through VAM's DataViz, the network uses Tableau to communicate crucial information that can inform response. Powered by Tableau, VAM also compiles data on economic and social circumstances in a community that could affect food security, providing a snapshot of conditions on the ground. Staff can collate this data with near-real-time operational updates from WFP's robust supply chains, allowing them to anticipate disruptions resulting from the pandemic and ensure vulnerable families receive the support they need.

WFP staff track hazards affecting food security using VAM's DataViz, powered by Tableau.

Leading with data

As the pandemic has progressed, this access to data has proven even more essential to the way WFP operates. "We've had to really rethink the way we do things if we can't have boots on the ground in communities, and we're facing huge challenges in the logistics of delivering resources to people," Porcari says. WFP's digital transformation over the last few years has brought data out of silos and made it broadly accessible across the organization to help make faster, better-informed decisions. This has enabled WFP to remain nimble in how it responds to the demands of COVID-19.

"In some cases, we've had to take a different approach to how we assist people," Wielezynski says. WFP serves some households by delivering essentials like corn, beans, wheat, and oil directly to them. But if the data shows that the supply chain for these products has been disrupted by COVID-19, WFP can quickly switch to sending them direct cash assistance. Conversely, if a family typically receives cash but the pandemic has inflated local prices, WFP can send them food instead. For example, having access to this data has enabled WFP to rapidly expand its cash-based assistance in the Middle East, Central Asia, and North Africa, where the total number of retailers distributing in-kind assistance almost doubled to 1,300 across the regions. The switch has enabled the organization to scale up support for vulnerable families on the receiving end of the political and economic fallout of the pandemic.

"All of these decisions need to be based on very sound data to justify the switch, but also to design the intervention correctly," Wielezynski says. "If the market prices are affected and we want to continue giving cash, how much do we give? And how can we continuously monitor these trends?" WFP's near-real-time insights into the economic conditions that affect food insecurity have enabled them to continue supporting families through this crisis.

Building a data resource for the humanitarian sector

Language about supply chains and purchasing power can sometimes obscure the fact that these decisions can be a matter of life or death for people. That urgency has driven WFP's investments in digital transformation, both in terms of the data infrastructure and the data culture of the organization. Porcari says what has made responding to COVID-19 possible is not having to start from scratch: "We have had the good fortune and foresight to invest in digital transformation over the last few years, so we have foundational platforms, services and skills that are already very strong. We have strong partnerships that allow us to develop and deploy at speed and scale."

WFP's investments in data have even enabled them to build a new resource to support the entire humanitarian sector. The Emergency Service Marketplace is a "one stop shop for the global humanitarian community to access WFP's supply chain services," Porcari says. WFP manages an extensive logistics and supply chain network, spanning air, water, and land transportation. During COVID-19, they've been able to mobilize this network to support the UN and other humanitarian agencies. "Now they can make a quick, online request to have health and humanitarian cargo shipped to the most vulnerable parts of the world," Porcari says. "It's like e-commerce for humanitarians."

A view of the shipments made via the Emergency Service Marketplace

Rapidly developed in response to the rising demand for assistance brought on by the pandemic, the Marketplace has supported 51 organizations since June, delivering enough humanitarian and health cargo to 265 countries to fill 19 Olympic-sized swimming pools. Because many planes and other transport services were halted, WFP's tracking system became essential, with Tableau visualizations making it easy for other humanitarian organizations, including WHO, to see where their shipments were in real time and ensure that they made it to their destination.

"It's a very simple and functional dashboard," Wielezynski says. "An organization like the UNHCR could use it to request a delivery of masks from China to Mozambique, and then we could optimize the delivery method and show them where the shipment is in transit through the dashboard. The data allows them to recalibrate or reprioritize on the fly, whereas before, they might've had to place 20 phone calls only to receive week-old information."

Staff can track the status of their shipments in near real time using Tableau.

In a crisis when coordination and timeliness are essential, WFP's data investments have enabled them to improve the entire aid delivery system for the sector. "This is what you might expect from an Alibaba or from an Amazon, not the humanitarian sector," Porcari says. "But that's what our vision is: If Amazon can do it, so can we. We owe it to our donors and the people we serve."