
Tech News - Data

1209 Articles

The pace of innovation never rests: How lessons from our past still influence us today from What's New

Anonymous
03 Dec 2020
7 min read
Andrew Beers, Chief Technology Officer, Tableau. Posted by Kristin Adderson, December 3, 2020.

Everyone is talking about the need for innovation these days, but there are a lot of questions about the best ways to move forward. Even before the Covid-19 crisis hit, McKinsey found that 92 percent of company leaders thought their business models wouldn’t stay viable at the then-current rates of digitization, and the pandemic has only accelerated this need for rapid innovation in the digital world.

As we’ve helped several customers navigate the uncertainty and find solutions, we always go back to what’s at the core of innovation at Tableau. A recent event was the perfect opportunity to pause and look at how we’ve also weathered uncertainty and increased our pace of innovation throughout the history of Tableau—and how these lessons still serve us today.

The IEEE VIS conference is the premier forum for academic and applied research in visualization, bringing together an international community to share ideas and celebrate innovation every year. It also hands out the Test of Time Awards honoring work that has endured and remained relevant for at least a decade after its initial publication. This year, Tableau co-founders Chris Stolte and Pat Hanrahan, with their former colleague Diane Tang, received the 20-year Test of Time Award for their groundbreaking research underlying Tableau, a paper titled Polaris: a system for query, analysis and visualization of multidimensional relational databases.

The Polaris user interface with explanations from the paper.

The Polaris paper laid out several key ideas: interactive specification of the visualization using a drag-and-drop user interface; the VizQL query language that described both the visualization and the data query; and the ability to live query relevant data directly from its database, eliminating the need to load data files into memory. In 2003, Chris Stolte, Christian Chabot, and Pat Hanrahan founded Tableau based on this work, and developed Polaris from an academic prototype into the company’s first product—Tableau Desktop. Of course, academic prototypes are usually intended to demonstrate an idea, not to scale to market. To become viable, they had to transform their prototype into a product that could withstand daily use by many different people with various needs, data, and environments. Transforming a prototype into a product that could be shipped was not a trivial undertaking, as many technical and product challenges stood between our founders and building a successful company.

Dr. Chris Stolte accepting the VIS Test of Time award on behalf of his co-authors, Dr. Pat Hanrahan and Dr. Diane Tang.

When I joined Tableau in 2004, I was Tableau’s seventh employee, jumping back into a developer role after leading engineering teams at another California-based startup. As a young company—even with an incredible new product—we had to constantly knock down technical challenges and think about how to be different. We focused on giving people new ways of asking and answering questions they couldn’t easily address with the existing tools they had on hand. That pushed us to figure out how to extend the original technology we had built around VizQL with even more new capabilities, including maps and geocoding, building statistical models, and supporting multiple data sources through blending and federation. This enabled us to leap ahead and show customers there were different and vastly improved ways of working with their data.
These early lessons in innovation still impact and inform everything we do in engineering and development at Tableau today. Early on, we learned to listen to what our customers were trying to accomplish, but we never stopped with only delivering what they asked of us. We also became customers of our own product by running our development team and the entire company on data analyzed with the product we were building. We didn’t want to miss any opportunities for improvements or just build what our customers needed right now. We wanted to reinvent how we could all work with data, then do it again and again, taking ourselves and our customers on a journey past how we were working with data today to a place we thought would be more powerful.

In addition to being our own customer and critic, we knew that as a young, small company we had to demonstrate how Tableau worked and do it fast. We did this by often demonstrating our product using data that our customers provided. This turned out to be a highly effective way to see the almost immediate impact of connecting people to the meaningful insights in their data. In fact, on one sales engagement our former CEO Christian Chabot gave a demo to about 40 people at a customer site. The demo went well, but the group was distracted. Chabot wondered what it could be and asked for feedback. He was told, rather excitedly, that the team was distracted from his demo by the insights Tableau revealed in their data. We learned early on that giving people new ways to do things opens their eyes to better ways of understanding their businesses.

Today, we continue the search for new and better ways to work with data. Whether we are helping customers analyze their data using natural language with Ask Data, or helping them surface outliers and explain specific points in data by leveraging the power of AI in Explain Data, our work in AI only continues to grow now that we’re a part of Salesforce. We recently announced that we are bringing together Tableau with Salesforce’s Einstein Analytics to deliver the best analytics platform out there. This new platform will create even more ways for people to make the most of their data, from improving the quality of insights, to helping them act faster, to enabling smarter data prep and easier sharing. This is just the beginning of our innovations to come with Salesforce as a partner.

Additionally, we are even more committed to making analytics accessible for everyone with our initiatives around becoming a data culture, where data is embedded into the identity of the organization. The World Economic Forum just released a report on the future of jobs with the main message that Covid-19 is accelerating the need for companies to scale remote work, speed up automation, and expand digitization. Old jobs will be lost and the newer ones will demand more advanced digital skills, including using data. In fact, the WEF listed data analysts and scientists as the top in-demand jobs of the future. Establishing a data culture is not an overnight process, but it’s a worthwhile and essential one, and we hope our work—especially in programs to promote data literacy—can help everyone explore, understand, and communicate with data.

All these recent efforts build on what we’ve strived to do since the beginning of Tableau—give people new ways of working with their data.
The original VizQL work is still the heart of our product and the work we have done since, including building new data platforms and applying good design principles to create highly engaging products. Everything we work on is to build on our mission to help people see and understand their data. We owe a great deal of thanks to the original groundbreaking work in VizQL that has truly stood the test of time.   We’re excited to continue to take that same focus, dedication, and excitement for innovation into the future. Today, as Tableau’s CTO, I’m focused on examining future technologies and product ideas that we can leverage to push our customers’ abilities to work with their data to new heights. And our R&D team remains steadfastly focused on pushing forward with new ideas and how to best turn those into the innovations that will continue to improve Tableau. If you’d like a more in-depth look at our research and development work, please follow our engineering blog. 


Tableau Prep Builder now available in the browser from What's New

Anonymous
02 Dec 2020
4 min read
Rapinder Jawanda. Posted by Spencer Czapiewski, December 2, 2020; updated December 15, 2020.

With the arrival of Tableau 2020.4, we’ve made exciting advancements for self-service data prep. Now, you can create new Tableau Prep flows as well as edit existing flows directly in the browser. Since all your analytical work can be done conveniently in one spot on the server, web authoring helps analysts eliminate the context switching between creating in the desktop and moving the flow to the server. For IT admins, web authoring simplifies the deployment experience and provides more visibility into the data prep process, enabling better data management.

A simpler, smoother data prep experience for all

Web authoring helps analysts by providing an integrated platform to work completely in the browser. You can create data sources, schedule runs, and use those data sources within your workbooks, all on your server. No more costly context switching between platforms and tools—everything can now be done in one place, all from anywhere you have access to a browser.

You can create workbooks and flows directly on the web by selecting from the “New” dropdown on the Start page, Explore page, or Data Source page. We have designed the new browser experience to include autosaving as well. When you create or edit a flow in web authoring, your changes are automatically saved in a draft—no need to explicitly save your flow, and no risk of losing work in progress. You will see your changes being saved in the header.

Since all your Prep work is now on the same server, everything you do in Prep web authoring is automatically compatible with your version of Tableau Server or Online. Everyone in the organization will get the latest version of Prep Builder in their browser when Server or Online is upgraded. Users only need to have a supported browser on their machine to start creating and editing flows. This means zero installs for users and less work for IT admins.

Update your flows faster

Prep web authoring allows you to update flows faster because you don’t have to download the flow, open it in desktop, and then republish the updated flow. Instead, you can click on the “Edit” link and open the flow to make a change directly in the browser. Fewer overall steps and no context switching means increased productivity.

Improved data governance

As an IT professional, you will have visibility into all flows that are created or being edited, giving you more control over data and resource usage. Remove unused flows or preserve resources by preventing multiple users from running the same flow. You can even put Prep web authoring on a separate node as part of your scale-out plan.

Prep web authoring allows your flows to be fully integrated with Tableau Catalog, part of our Data Management offering. You get complete visibility into the flows being created and run since all of them are now on the server rather than on individual desktops. With Catalog’s lineage and impact analysis, you can easily track the data sources being created with flows and see which workbooks are using them.

Get started today

To get started with Tableau Prep on the browser, simply upgrade Tableau Server to version 2020.4, then enable flows in web authoring. For more information, read about these settings and topology changes. Now you’re ready to start creating flows in the browser. Just click “New” > “Flow” on your Explore page and you can start building your flow just like in Tableau Prep Builder!

Want to learn more?
Interested in more details about Tableau Prep in the browser? For more information, see our Help documentation for Tableau Prep on the Web.  Eager to learn more about how to use Tableau Prep? Head over to the Tableau eLearning site and check out the Prep Builder course!
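Because every flow now lives on the server rather than on individual desktops, admins can also inventory them programmatically. The following is a minimal, illustrative sketch using the tableauserverclient Python library; the server URL, credentials, and site name are placeholders, and it assumes the flows endpoint exposed by recent releases of that library rather than anything specific to this announcement.

```python
# Minimal sketch: list the Prep flows published to a Tableau Server site.
# Placeholders: server URL, credentials, and site name. Assumes the flows
# endpoint available in recent tableauserverclient releases.
import tableauserverclient as TSC

tableau_auth = TSC.TableauAuth("admin@example.com", "password", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    all_flows, pagination = server.flows.get()
    for flow in all_flows:
        # Each flow item carries the metadata an admin would review:
        # name, owning project, and when it was last updated.
        print(flow.name, flow.project_name, flow.updated_at)
```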


World Statistics Day: The search and need for trusted data from What's New

Anonymous
02 Dec 2020
5 min read
Andy Cotgreave, Technical Evangelist Director, Tableau. Posted by Kristin Adderson, December 2, 2020.

Editor’s note: A version of this article originally appeared in Information Age.

This year’s UN World Statistics Day theme of “connecting the world with data we can trust” feels particularly timely. The global pandemic has put data at the heart of how the public is informed and persuaded to change behaviors. There has been a huge learning curve for the general public and for governments, with many new public-health statistical systems being built from scratch in country after country, globally.

Even though data has become more influential in our lives, people’s level of confidence in using and asking questions of data hasn’t increased. Simply being presented with statistical charts of the pandemic hasn’t made us all more data literate. If the handling and presenting of data during the pandemic has shown us anything, it’s that public citizens, politicians, and the media all need to commit to knowing and interrogating data. This will be even more relevant as the second COVID-19 infection wave affects our economies and we look for signs in the data that the pandemic may be receding.

In the spirit of World Statistics Day, what more can governments be doing to improve how they use and present data to the public? Should citizens themselves be responsible for making sure they understand data being presented to them, so they can form objective opinions?

What is data without trust?

Those in positions of responsibility are facing major challenges when it comes to trusted data use—as the current pandemic shows how important data is for society, politics, and companies. Transparency is vital. This situation also shows that the understanding of data, and related analyses, is not obvious. Do consumers of the insights know where the data comes from, or how it was modeled? Is it clear where there is uncertainty in the underlying data? Is the source data available for others to interrogate?

Think back to the “flatten the curve” charts that taught us so much at the start of this pandemic. The images presented two possible outcomes, based on different levels of lockdowns. This type of chart was easy to understand, and they were accompanied by detailed data stories explaining how they worked. By not overcomplicating the narrative, local politicians and media outlets were able to clearly communicate their key messages to the public and, through that clarity and openness, were able to establish a level of trust. As we all came to terms with the new disease, the data—having been presented so well—helped people change their behaviors.

Data is at the crux of the decision-making process

Over time, as pandemic fatigue has set in and the reality that science and statistics are uncertain has sunk in, people have become less trusting of the data presented to them. First off, an inherent problem is that people believe data contains more “truths” about what has happened. This is a fallacy. For a start, all data is messy—even in robust systems. Furthermore, charts are not neutral. Imagine a chart showing an “average number of COVID-19 cases”; I could choose the mean, median, or mode. Or I could choose a 7- or 14-day moving average, with each chart telling a different story. We also read charts as we read an opinion piece in the pages of a newspaper: our own biases affect how we interpret the data. Just like the audience’s own biases, their level of data literacy also impacts the interpretation.
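The point about averaging choices is easy to make concrete. Here is a minimal pandas sketch, using invented daily case counts, showing how the same series produces noticeably different summaries depending on which average you pick.

```python
# Illustration only: the same invented daily case counts summarised four ways.
import pandas as pd

cases = pd.Series(
    [120, 95, 140, 410, 380, 90, 60, 130, 150, 420, 400, 85, 70, 160],
    index=pd.date_range("2020-10-01", periods=14, freq="D"),
)

print("mean:", cases.mean())                    # pulled up by reporting spikes
print("median:", cases.median())                # closer to a "typical" day
print("7-day average (latest):", cases.rolling(7).mean().iloc[-1])
print("14-day average (latest):", cases.rolling(14).mean().iloc[-1])
```

Each of these is a defensible "average number of cases", yet a chart built on any one of them tells a different story, which is exactly why readers need to ask how a figure was produced.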
All of this is mitigated if data sources are open, skills are always being developed, and a culture of conversation is encouraged.

Data literacy should be a core competency

Even before the pandemic, it was clear that national data literacy levels should be raised significantly. But this year, COVID-19 has highlighted the ever-present challenge of data literacy both within the wider population, and at the top levels of government and policy-making. At its most basic level, data literacy is the ability to explore, understand, and communicate with data. But in order for a data-led strategy and approach to work effectively on a large scale, more effort needs to be put into considering how to build a Data Culture. Specifically, one that encourages answers and interrogations to a series of fruitful questions about data in society and business.

A significant part of the challenge facing government and businesses is to shatter the inscrutability around data, and instill data literacy as a core competency across a far broader cross-section of the workforce. I challenge the government and businesses to do better at making data literacy, and the skills required, both accessible and a priority. By doing this, we will then begin to build a society that is more inclusive, trustworthy, and collaborative with data—ultimately connecting the world through data that we can trust.


An inside look at Tableau Virtual Training from What's New

Anonymous
01 Dec 2020
4 min read
Kristin Adderson, December 1, 2020.

Virtual training is something I’m very passionate about because I’ve experienced firsthand how powerful it can be. But it recently occurred to me that if you’ve never taken any virtual training, it’s likely you don’t fully understand what it is. Is it eLearning? Pre-recorded webinars? It isn’t very clear. In our rapidly changing digital world, many learning offers are ‘on-demand,’ and virtual training refers to any learning that can be done online. I’d like to give you a behind-the-scenes look at what you can expect in a Tableau virtual training.

When you attend a Tableau virtual training, you get the same in-depth content found in our in-person classrooms, delivered by an exceptional instructor. But the similarities end there. Don’t get me wrong; I love our in-person classes, but the last time I attended an 8-hour, all-day training, it wiped me out. Learning new content for 8 hours a day is exciting but exhausting. In contrast, our virtual training is delivered via Webex and scheduled over a week, typically for 2 ½ hours a day (this varies per class).

As an instructor, it’s always so rewarding to see my students progress during the week. I love building rapport with them and seeing them connect with the product and with each other. Students still make connections in a virtual classroom—with classmates across the globe, instead of across the room. A question from someone in Tokyo can inspire someone who is from Denmark. Virtual training averages about ten attendees per session, providing students with the classroom feel and benefits of learning from each other. Activities are scattered throughout the training sessions, providing hands-on practice opportunities to apply what you’ve learned. By having an instructor present, you get your questions answered before you can get stuck. Additional practices are assigned as homework to encourage further exploration.

So how are questions handled during virtual training? You simply ask your questions through your audio connection on your computer. We recommend using a headset with a microphone so your questions can be heard clearly. There’s a chat option for those who don’t want to speak up in class, so you can address your question directly to the instructor or the entire class. Get stuck on an issue while doing homework? Bring your questions to class the next day to discuss during the homework review period. Or, email the instructor, who will be happy to guide you in the right direction.

Questions aren’t a one-way street in Tableau virtual training. The instructor will use a variety of methods to engage with each attendee. Classes kick off with the instructor and students introducing themselves. This helps build a community of learning and makes it easier to interact with the class when you know a little about everyone. The instructor encourages a high level of engagement and will ask the class questions to track their understanding of the material. Responses can be given through audio, chat, or icons. Polls are sometimes used to validate understanding.

All Tableau classroom training is available in a live virtual format. Our most popular classes (Desktop I and Desktop II) are offered in English, Spanish, Portuguese, French, German, and Japanese.

Get the virtual advantage

Virtual training holds a special place in my heart. Speaking as someone with many years of personal experience with virtual training, I still appreciate what it offers.
To recap, here are five reasons why you should consider virtual training:

1. Same content, only more digestible. Virtual training contains the same content as our in-person classes but broken into smaller segments. Learn a little at a time, absorb and apply the concepts, and come back the next day for more.
2. Available anytime, anywhere. Virtual training provides the flexibility to attend classes in multiple time zones and six different languages. As long as you have a strong internet connection, you are good to go.
3. Less disruptive to your daily schedule. Virtual training makes it easy to get that valuable interaction with a live instructor without even having to leave your office (or your house, for that matter).
4. Real-time feedback. No need to struggle on your own. Ask the instructor and other attendees questions, understand different use cases, and get the guidance you need while doing hands-on activities.
5. More practice makes perfect. There’s plenty of hands-on practice time both during class and with extra homework.

“Time flies when you’re having fun with data” is an excellent way to describe the Tableau virtual training experience. If you’re looking for a flexible and fun way to gain valuable Tableau knowledge, go virtual. Take it from me, a virtual training veteran. You’ll never look at learning the same way again. Find a virtual class today!


Our on-prem to cloud database migration: A collaborative effort from What's New

Anonymous
20 Nov 2020
10 min read
Erin Gengo, Manager, Analytics Platforms, Tableau, and Robert Bloom, Manager, Data Science and Data Engineering, Tableau. Posted by Tanna Solberg, November 20, 2020; updated November 23, 2020.

In our last cloud migration post, we outlined the needs and evaluation criteria that drove us to look to cloud database options. Now, we’re going to discuss the first stage of the migration: moving our data into Snowflake so we could take advantage of its many benefits.

Self-service analytics is a delicate balance between enabling users with the data and insights they need to do their work while maintaining effective data governance enterprise-wide. This delicate balance between individual empowerment and centralized control extends to our physical migration of data and Tableau content from one platform to another, as well.

Our migration timeline and process framework guided each team so they knew exactly when to join in and transition their data sources from SQL Server to Snowflake. Adhering to this timeline was essential because it was costly to the business, both in infrastructure resources and people hours, to keep SQL Server running in parallel with Snowflake. We intentionally started with the NetSuite pipeline belonging to our Finance Analytics team—a well-governed, well-defined data domain with clear owners to migrate. Starting there, we knew we would benefit from a strong partnership and robust testing scenarios, and that we could iron out the kinks for the rest of Tableau before we performed our full migration.

A new way of thinking about data management

As we reimagined data management across all of Tableau, we identified five pillars for the migration process framework that dovetailed well with our Snowflake selection criteria, and would thereby increase trust and confidence in the data that everyone uses. These pillars are: staffing, governance, authentication, communication, and documentation.

We’ll first discuss staffing, governance, and authentication in this post, highlighting some key lessons learned, unexpected issues and responses, and recommendations to consider when migrating and tackling data sets—large or small, simple or complex.

Staffing

We don’t want to sugar-coat the complex undertaking of any migration at enterprise scale. We started by forming a small, core migration team and quickly realized more assistance was needed to update approximately 9,500 workbooks and 1,900 data sources, and address any downstream content effects caused by differences at the database level. The core team possessed some essential skills we suggest that organizations who make the same journey have: project management; development expertise with Python or a similar scripting language for modifying semistructured data like XML; and Custom SQL savvy.

Recruiting talent with the right mix of data and programming skills that we needed was time consuming; we ended up reviewing upwards of 300 resumes and placing dozens of calls. Our central migration team counted seven people—1.5 full-time program managers for six months, 0.25 server admins, approximately three full-time engineers, and two contractors—supporting upwards of 15-20 domain experts across sales, finance, and marketing. The extended team—data scientists, engineers, analysts, and subject matter experts who work in business teams and helped move or transform data in Snowflake—were the first line of defense when questions or concerns surfaced from business users.
These “stewards” of our data were able to answer questions ranging from data access and permissions, to process and timeline questions. “We were the bridge between IT and finance business users since many data sources were managed by our team,” explained Dan Liang, formerly Manager of Finance Analytics at Tableau (now Manager, Finance Data Office, Salesforce). IT provided the centralized platform and standardization across the enterprise, but Finance Analytics tailored communication for their end users. “It was all hands on deck for a month as we handled content conversion, testing, and validation of data sources for our team’s migration. Tableau Prep was an integral part of our validation strategy to automate reconciliation of key measures between Snowflake and SQL Server.”

Recommendations:

- Identify and define roles and responsibilities: Without clear roles, there will be confusion about who is responsible for what specific aspects of the process. In our case, we had data stewards and consumers test the data, with specific experts designated to sign off on the data.
- Automate (where possible): We could have allocated more time to better automate this process, especially around workbook and data source XML conversion, as well as data testing.
- Know that you’re comparing apples and oranges: We provided summary statistics like row and column counts and data types to our testers to help them compare the two data sets. But because of many factors (differing ETL and refresh times, plus potential latency), it was very difficult to pin down significant differences versus noise.

Governance

Our cloud migration was a golden opportunity to strengthen competencies around governance. Everything from Tableau Server content to naming conventions in Snowflake received fresh scrutiny in an effort to improve user experience and ensure scale. Those teams that invested in governance by establishing single sources of truth (through well-curated and certified, published data sources) had a more straightforward content migration experience. Those that didn’t invest as much in governance struggled with unclear ownership and expectations around data, and their users encountered surprise effects downstream during the migration, like broken data pipelines and dashboards.

Because we had many different languages used by data engineers over time, we also conducted thoughtful upfront discussion about standardizing code patterns, including outlining which characters were and weren’t allowed. Acting on these discussions, “We implemented Continuous Integration and Continuous Deployment (CI/CD) on our source control tool (Git), so we could more efficiently peer-review code and transfer work between members of the team as needed,” said Isaac Obezo, a software engineer. “This was much easier than having domain experts do everything in a pipe.”

Further strengthening governance, built-in Snowflake features enable transparency into database metadata, including the ability to see and save all queries processed. Since that history is typically only stored for a week, we built a pipeline to store all of the historical data so we could provide more targeted support to end users, create new data curations, and promote our single sources of truth. In finance, we used this data to proactively reach out to users who experienced query timeouts and other errors. It also helped us maintain user access controls around Sarbanes-Oxley (SOX) compliance.
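A pipeline like the one just described can be sketched roughly as follows. This is not our production code; the account, credentials, warehouse, and archive table are placeholders, and the query uses Snowflake's standard INFORMATION_SCHEMA.QUERY_HISTORY table function, whose short retention window is the reason for the periodic copy into a permanent table.

```python
# Rough sketch: periodically copy recent Snowflake query history into a
# permanent archive table so usage can be analysed beyond the built-in
# retention window. Connection parameters and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder
    user="svc_monitoring",       # placeholder service account
    password="********",
    warehouse="MONITORING_WH",
    database="ANALYTICS",
    schema="MONITORING",
)

cur = conn.cursor()
# Append only queries newer than what the archive already holds.
cur.execute("""
    INSERT INTO QUERY_HISTORY_ARCHIVE
    SELECT QUERY_ID, QUERY_TEXT, USER_NAME, WAREHOUSE_NAME,
           ERROR_CODE, START_TIME, TOTAL_ELAPSED_TIME
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 10000))
    WHERE START_TIME > (SELECT COALESCE(MAX(START_TIME),
                                        '1970-01-01'::TIMESTAMP_LTZ)
                        FROM QUERY_HISTORY_ARCHIVE)
""")
cur.close()
conn.close()
```

Scheduled daily, a job along these lines is enough to support the kind of proactive outreach to users with query timeouts described above.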
Recommendations:

- Use data quality warnings: These can communicate the status of a data source to users quickly and easily so they know when migration will happen and what will change.
- Recognize data management is a marathon—not a sprint: Progress and value deliverables are iterative. We concentrated on delivering smaller, but valuable, changes as we migrated to the best model or data. We also benefited from using data to monitor performance of our cloud solution. Below is a sample visualization we built to monitor usage and performance of Snowflake.
- Minimize tech debt: Tableau Catalog gave us visibility into our data, including lineage and impact analysis to identify content owners. We were able to easily communicate to people what data and content could be deprecated and what was critical to move to Snowflake because of its usage or downstream impact. Consider leveraging an enterprise data catalog to help your end users build knowledge and trust in data assets.
- Establish a clear cutoff: Appropriately budgeting time to complete a cloud migration is key; we took an informal survey and estimated an average of five hours per data source and one to two hours per workbook migration. Eventually, a final cutoff must be established where employees no longer have support from the legacy database or from the central migration team. If someone didn’t migrate their data when given ample time and assistance, they likely no longer needed it.

Authentication

Changing to a cloud-based database required changing the database authentication method employed by users, apps, and connected systems. We went from an all on-premises world of Active Directory (AD) identity management and automatic Windows authentication through AD for users, apps, and systems to the reality of the cloud, where identity management across different apps or systems is not seamless or integrated out of the box. The best option is a federated Identity Provider (IdP) with Single Sign-On (SSO) capabilities across different cloud vendors and apps. If you are planning on having multiple cloud-based apps, or want users to have an SSO experience, selecting the IdP that works best for you should be done before or in conjunction with your Snowflake adoption.

Initially, we connected directly to Snowflake with SAML via our IdP. This works fine, but has pain points, especially coming from the automated world of Active Directory, namely: SAML IdP password changes will require manual embedded password changes in all Tableau content using embedded credentials. In the time between a user changing their password and updating connections in Tableau, any extract refreshes using them would fail and workbooks using their embedded password would not render. The only way to have a seamless password change experience with nothing breaking in Tableau was to switch to OAuth use. Be sure to check if your IdP can be used with OAuth for Snowflake!

One important lesson learned here was the power of a Tableau email alert. We worked with IT to automate an email that assists users with password rotation. One month out from a required password rotation, users receive an email from their Tableau Server admins reminding them that their password needs to be updated, along with a link to just that content on Tableau Server.
Recommendations:

- Be prepared to communicate and document changes: When changing authentication types, you can expect to receive and have to answer many questions about how it works and how it differs, keeping in mind users’ different degrees of technical understanding.
- Strategically manage your storage driver: When you’re conducting an enterprise deployment to a platform like Snowflake, it’s important to push out the driver to everyone’s machines to maintain version control and updates.

Supporting end-users and content migration

Beyond staffing, governance, and authentication, communication and documentation were equally important to guarantee everyone was aligned throughout all phases of our migration to Snowflake. In our next blog of the series, we will explore those critical pillars to enable a better end-user experience and transition so no critical workbooks were left behind.

We also hope that sharing some of the individual experiences of our business teams helps other organizations and our customers better understand what it takes for an enterprise migration. Centralized coordination is mandatory, but business teams and their end users must be equal partners, contributing from beginning to end.

“We knew we needed a landing place for our data, but didn’t realize how valuable it would be as a platform for collaboration because it was simple and brought everyone, including components across the business, feeding into the same thing,” concluded Sara Sparks, Senior Data Scientist at Tableau. Tableau is now in tune as our people and data sources are more unified.

If you missed it, read the first post in our cloud migration story—we covered our evaluation process for modernizing our data and analytics in the cloud.


Coming soon to Tableau: More power, simplicity, and predictive flexibility from What's New

Anonymous
20 Nov 2020
5 min read
Sarah Wachter, Product Management Manager. Posted by Tanna Solberg, November 20, 2020.

We were excited to release Predictive Modeling Functions in 2020.3, empowering Tableau users with predictive statistical functions accessible from the native Tableau table calculation interface. We put powerful predictive analytics right into the hands of business users, keeping them in the flow of working with their data. Users can quickly build statistical models and iterate based on the prediction quality, predict values for missing data, and understand relationships within their data.

However, we knew that a significant use case was still challenging. Surprising exactly no one, a key use case for predictive modeling is to generate predictions for future dates. While you can accomplish this in 2020.3 with some complicated calculations, it certainly isn’t easy. We also knew that linear regression, specifically ordinary least squares, isn’t always going to be the best predictive model for many data sets and situations. While it’s very widely used and simple to understand, there are other regression models that are better suited for certain use cases or data sets, especially when you’re looking at time-series data and want to make future projections.

We want to make sure that our users have the power, simplicity, and flexibility they need to apply these functions to a wide variety of use cases, and so we’re delighted to announce two enhancements to predictive modeling functions. In the 2020.4 release, you’ll be able to select your statistical regression model from linear regression (the default option), regularized linear regression, or Gaussian process regression. You’ll also be able to extend your date range—and therefore your predictions—with just a few clicks, using a simple menu. With these new features, Predictive Modeling Functions become even more powerful and flexible, helping you see and understand your data using best-in-class statistical techniques. Let’s take a closer look at each feature.

Model Selection

By default, predictive modeling functions use linear regression as the underlying statistical model. Linear regression is a common statistical model that is best used when there are one or more predictors that have a linear relationship with the prediction target (for example, “square footage” and “tax assessment”) and those predictors don’t represent two instances of the same data (“sales in GBP” and “sales in USD” represent the same data and should not both be used as predictors in a linear regression). Linear regression is suitable for a wide array of use cases, but there are some situations where a different model is better. In 2020.4, Tableau supports linear regression, regularized linear regression, and Gaussian process regression as models.

For example, regularized linear regression would be a better model in a situation where there is an approximately linear relationship between two or more predictors, such as “height” and “weight” or “age” and “salary”. Gaussian process regression is best used when generating predictions across an ordered domain, such as time or space, or when there is a nonlinear relationship between the predictor and the prediction target. Models can easily be selected by including “model=linear”, “model=rl”, or “model=gp” as the first argument in a predictive modeling function.

Date Axis Extension

Additionally, we knew that making predictions for future dates is a critical feature of predictive modeling functions.
To support this, we added a new menu option to Date pills that allows you to quickly and easily extend your date axis into the future. While we built this function to support predictive modeling functions, it can also be used with RUNNING_SUM or other RUNNING_ calculations, as well as with our R and Python integrations.

Let’s take a look at how these new functions can be applied! First, let’s look at how to extend your date axis and make predictions into the future. In the below example, we’ve already built a predictive modeling function that will predict our sales of various types of liquor. Of course, since this is a time series, we want to see what kind of sales numbers we can expect for the coming months. This is as simple as clicking the Date pill, selecting “Show Future Values”, and using the menu options to set how far into the future you want to generate predictions.

Next, let’s look at model selection. In the below example, we’ve already built a predictive modeling function that uses month and category as predictors for sales of various types of liquor. We can see that the default linear regression is capturing sales seasonality and overall trends. However, we can easily switch to using regularized linear regression to see how the regularized model affects the overall amplitude of the seasonal behavior. Since we’re building predictions across an ordered domain (time), Gaussian process is also a valid model to use with this data set. In either case, it’s as simple as including “model=rl” or “model=gp” as the first argument of the predictive function.

While we’ve made it very easy to switch between models, for most use cases linear regression will be an appropriate choice. Selecting an incorrect model can lead to wildly inaccurate predictions, so this functionality is best reserved for use by those with a strong statistical background and understanding of the pros and cons of different models.

Get started with the newest version of Tableau

With these additions, we’ve significantly expanded the flexibility and power of our predictive modeling functions. Gaussian process regression will let you generate better predictions across a time axis, and regularized linear regression will let you account for multiple predictors being affected by the same underlying trends. Date axis extension gives you an easy, intuitive interface to generate predictions into the future, whether you’re using predictive modeling functions or external services like R or Python. Look for these new features in the upcoming Tableau 2020.4 release to get started—and see what else we’re working on.

As always, thank you to the countless customers and fans we’ve spoken with as we built these new features. We couldn’t have done it without you.
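If you want to build intuition for why the model family matters before trying it in Tableau, the same three families can be compared in a few lines of scikit-learn. This is purely illustrative and is not Tableau's implementation; the monthly series below is invented, and the point is only that the three models extrapolate the same data very differently.

```python
# Illustration only (not Tableau's implementation): fit a linear model, a
# regularized linear model, and a Gaussian process to the same invented
# monthly series, then predict six future months.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Two years of made-up monthly sales: a trend plus yearly seasonality.
months = np.arange(24).reshape(-1, 1)
sales = 100 + 3 * months.ravel() + 20 * np.sin(2 * np.pi * months.ravel() / 12)

future = np.arange(24, 30).reshape(-1, 1)  # six months past the data

linear = LinearRegression().fit(months, sales)
regularized = Ridge(alpha=1.0).fit(months, sales)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=3.0) + WhiteKernel()).fit(months, sales)

print("linear:", linear.predict(future).round(1))       # straight-line trend only
print("ridge:", regularized.predict(future).round(1))   # damped linear trend
print("gaussian process:", gp.predict(future).round(1)) # behaves very differently outside the data
```

Running it makes the caution above concrete: the three projections diverge quickly once you leave the observed date range, so the model choice deserves real statistical judgment.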

Looking back at Election 2020: The power of online polling and visualization from What's New

Anonymous
20 Nov 2020
9 min read
Steve Schwartz, Director, Public Affairs at Tableau. Posted by Tanna Solberg, November 20, 2020.

The 2020 presidential election was two weeks ago, but in the world of election data, results are still being processed. Every election is a data story, but 2020 was especially so. As analysts pick apart the accuracy of the polls—and voters decompress from consuming the stream of data stemming from the overwhelming number of mail-in votes this year—Tableau and SurveyMonkey have taken time to reflect on the partnership launched this fall to visualize critical, public opinion data.

Through the Election 2020 partnership, SurveyMonkey continuously polled a subset of its nearly 2 million daily survey respondents on a range of topics related to the election—from candidate preference, to likelihood of voting by mail, to concerns about COVID-19. Working with such a robust data set, they were able to break down their data by a number of demographic cuts and visualize it in Tableau, so anyone could analyze the data and understand what factors could shape the outcome this year. Axios, as the exclusive media partner for the initiative, contextualized the data and offered their own analysis.

Tableau talked with Laura Wronski, research science manager at SurveyMonkey, about how their online polling data captured the eventual results of the election, the power of data visualization to showcase the complexities in demographic analysis of voter trends, and the effect that key issues—like mail-in voting and COVID-19—had on the outcome.

Tableau: As you look back on the polling data you gathered in the lead-up to the election, what is your big-picture takeaway about what your data revealed?

Wronski: One thing that we really came to appreciate was the value of having the 50-state Candidate Preference map to visualize our data. We actually feel that we did well in terms of directionally calling the states correctly. We were dead-on in a lot of cases, and the places where we were off, they were oftentimes less than the degree to which other pollsters were off. And when you look at our map, you can see that the states we focused on for the whole election were the ones that proved to be very pivotal. In Georgia, we had a slight Biden lead, and for Arizona as well. We had Nevada very close, though that ended up being more of a Biden state than our data predicted. What’s interesting is that the reason these states are so critical is that the demographics there are changing. The fact that Georgia was competitive and went blue for the first time in many years was fascinating. Our data showed that, but it’s something that also gave us pause as we were putting up those numbers—we really wanted to be confident in the data.

This Candidate Preference map shows the survey responses from the question: “If the 2020 presidential election were being held today among the following candidates, for whom would you vote?”

This was a year in which people’s confidence in polling data was ultimately quite shaken. But as you said, your data was pretty accurate. What does that say to you about your methodology of conducting online surveys?

That’s something that we’ve been talking about a lot internally. There were obviously some big errors this year when comparing all pre-election polling to the final outcomes. Wisconsin, for instance, is a state that pretty much everybody got wrong.
The FiveThirtyEight polling average for Wisconsin aggregated 72 polls in the two months leading up to the election: only two had a tie, and just one—one of our polls—had a Trump lead at some point. But 69 polls had Biden winning, many of them by a wide margin, and he ended up winning by just 1 percent or so. That means nearly all of the polls overestimated Biden. That is disorienting, because while a two-point error is not a big one, if 10 pollsters all show the same error, it gives people a sense of confidence in the data that didn’t actually pan out.

One thing that we have seen through our polling efforts was that because we collect data through online surveys and operate at such a large scale, we’re able to get pretty robust data from small segments and subgroups of people. So we could look at responses just among Black Americans, and we did a story with Axios focused on young voters. A lot of times, these subsets are really hard to see in a 1,000-person national poll. So that is something that we think is an advantage to online polling going forward—particularly as what we’ve seen this year is that it’s hard to get the right mix of people in the underlying sample. The more we’re able to get to a large scale with the data, the more we’re able to look closely at respondents and cut the data by different factors to make sure we’re looking not just at who lives in rural areas, for instance, but that we’re getting the right mix of people who live in rural areas by race and education.

Credit: Axios

As you’re working with such a vast amount of data and identifying trends, why is visualizing the data so important?

Visualization is so useful because it really allows you to see the trends, rather than look at the numbers and get a relative sense for what they’re showing. We built a dashboard that enables people to dig into different demographic groups and really understand differences among them, not just between them. In looking at Black voters, for instance, you’re able to layer in education or gender, and see how more granular subsets fall in terms of candidate preference. And looking at white voters as an entire group, they were the only ones in our dashboard to fall on the Trump side of the margin. But if you add in education, you can see that it was just white voters without a college degree who fell on that side. And if you add in gender, it’s really just men. The more cuts you can do, the more you can see that there are such overwhelming divides along so many demographic lines. There is a temptation to treat [demographic] groups like race as a monolith, but being able to visualize the data and see how different factors layer in encourages people to take a more nuanced approach to understanding voter groups.

The way this election unfolded hinged on not just the number of votes, but on the way people voted. What did your polling data reveal about the role that mail-in voting ultimately played in the election?

Early on in the process, our data was pointing to what would be a big divergence by party in voting by mail. If you look at our dashboard where you can explore people’s likelihood of voting by mail, you can see a dark purple map indicating the high percent of Democrats who are very likely to vote by mail, and conversely, you can see a deep orange map of the high percentage of Republicans who are not at all likely to vote by mail. That obviously had an effect on the timeline of the election, and the way the results played out on election day and the days that followed.
We’re happy that we got the data out there, and we were right on the money in the sense of how much of a story it would be. I think there’s more to think about how we tell the story around how high rates of mail-in voting can affect the timing of results. People are so used to having all the data on election day, but is there a way we can show with data and visualizations how mail-in voting can extend that timeline?

Another significant factor in this election was the context of COVID-19. As you were polling people about the election and their preferences, you were also asking respondents questions about COVID-19 and their concerns around the virus. Did you see any correlations in the data between people’s COVID responses and the way the election turned out?

Dating back to February, we’ve asked for people’s responses to five questions that relate to their concerns about the coronavirus. And over time, what we’ve seen is that, on the whole, people are more concerned about the economic impact of COVID-19 on the country [overall]. That’s much higher than the number of people who said that they were worried about the economic impact on their own households. Usually the lowest concern is that they or someone in their family will get coronavirus. Rates of concern about the virus were also much lower among white respondents.

We’ve seen in our data that, on the whole, Democratic voters were much more likely to say they were concerned about COVID, and Republicans were less likely to see it as a threat—and if they were, it was much more focused on the economy. So it’s clear that people were looking at the macro level, and that the economic impacts, even more than the health concerns, were what motivated voters. As waves of the virus move across the country, it’s useful to track what changes and what doesn’t about people’s opinions. We can see how these concerns impacted what people thought about when voting, and—when you look at mail-in voting rates—how it impacted how they voted.

To see more from Tableau, SurveyMonkey, and Axios’s Election 2020 partnership, visit the website.


European businesses navigate pandemic: YouGov survey finds data gives critical advantage, optimism, and confidence from What's New

Anonymous
18 Nov 2020
7 min read
Tony Hammond, Vice President Strategy and Growth, EMEA. Posted by Tanna Solberg, November 18, 2020; updated November 19, 2020.

With a surge of COVID-19 cases triggering a second shutdown in Europe, continued disruption is imminent as businesses face more difficult decisions and challenging realities in the days ahead. We remain in a fight-or-flight mode, and with that comes added pressure to get things right, to quickly learn from our mistakes, and to understand our data. As 2021 approaches, the future remains uncertain, weighing on the minds and hearts of business leaders, but data can be a guide—helping organisations out-perform and out-survive.

Our team in Europe, and frankly around the globe, has seen change, agility, digital transformation, and data accentuated by the pandemic and prioritised by organisations as they navigate a new normal and chart a plan forward. No journey is the same, however. With organisational challenges and shifting customer and business priorities, some businesses are lightly tip-toeing into the age of data while others are already reaping the benefits and building a “memory bank” by learning, testing, and understanding their data.

We partnered with YouGov, an international research data and analytics group headquartered in London, to survey more than 3,500 senior managers and IT decision makers in four major European markets: the UK, France, Germany, and the Netherlands. We explored several key questions like:

- What are the benefits that organisations experience when using and relying on data (especially during the pandemic)?
- What lessons have businesses learned thus far, as a result of the pandemic?
- What will companies prioritise when it comes to future plans, and what role will data play?

In this blog post, we’ll share top learnings from our research. Explore the full results in this visualization on Tableau Public.

The data divide between European businesses

Greater optimism amongst data-driven leaders, businesses

Nearly 60 percent of survey respondents identified as data-driven, which positively indicates that leaders are prioritising digital acceleration and data transformation regionally. Most (80 percent) of that same group believes that being part of a data-driven organisation puts them at a greater advantage than the businesses who aren’t data-driven. They also have greater optimism about the future of their business because analytics are giving them the clarity to handle obstacles while seizing on the opportunities in their sights.

Those same organisations expressed multiple advantages gained from using data, including: more effective communication with employees and customers; making strategic decisions more quickly; and increased team collaboration for decision making and problem solving, which is essential when new problems surface weekly—ranging in complexity and significance to the business. Now in a second phase of lockdown across many European countries, we can see how data-driven organisations responded effectively the first time, what they learnt (good and bad), and how they’ll apply that as the cycle repeats.

Some organisations benefiting from a data-focused approach, before and during the pandemic, are Huel, a nutritional meal replacement provider based in the UK, and ABN AMRO, one of the world’s leading providers of clearing and financing services. A fast-growing start-up, Huel struggled with delayed decision-making because analytics took too long and required too much effort.
By embracing Tableau’s interactive, self-service analytics, they’re democratising data worldwide and creating a data-driven company culture. “Our data-driven strategy is helping us respond to consumer behaviour—enabling us to pivot and react with greater speed and clarity. It’s all about empowering the full organisation through data,” said Jay Kotecha, a Huel data scientist.

Speed, high-volume transactions, security, and compliance are challenges that global clearing banks face daily—particularly during the pandemic, as settlement demand grew to 3x the daily average from market volatility. ABN AMRO needed access to accurate data to monitor their settlement process and analyse counterparty risk in real time, and used Tableau analytics to securely explore data and act on insights with speed, agility, and clarity.

Organisations that aren’t data-driven at a disadvantage

While the YouGov study revealed favourable perspectives with many European businesses, some haven’t fully grasped the value and importance of data. Only 29 percent of respondents who classified themselves as non-data-driven see data as a critical advantage, and 36 percent are confident that decisions are supported by data. Furthermore, 58 percent of the non-data-driven companies found themselves more pessimistic about the future of their business. They enter the future slightly data-blind because they want to reduce or stop investing in data skills, which means their analysts, IT, and employees are less equipped with data-related resources and their business will likely lag behind competitors who embrace, and therefore thrive with, data.

Key takeaways

Data literacy is a priority for businesses that are data-driven, increasing competitiveness

Even as some respondents recognise data’s benefits, nearly 75 percent of the data-driven companies across all four markets still see a need to continue (or increase) spending on data skills training and development in the future. “We started building data skills across the business in 2013, and the pandemic has definitely seen us benefit from these capabilities,” explained Dirk Holback, Corporate Senior Vice President and CSCO Laundry and Home Care at Henkel, one of the world’s leading chemical and consumer goods companies, based in Düsseldorf, Germany. Also a Tableau customer, Henkel set a strong data foundation before the pandemic hit and was glad that they didn’t let up on data analytics training. Employees now interpret data and apply it to their business area while juggling dynamic regulations, processes, and supply chains.

Investment in data literacy creates future success, as we’ve seen with many of our European customers like Henkel, and doesn’t necessarily require large, enterprise efforts. Even smaller, incremental projects that foster data skills, knowledge, and analytics passion—like team contests, learning hours, or individual encouragement from a supervisor to participate in a relevant training—can create a foundation that benefits your organisation for years to come.
Benefits gained from a data-literate, data-driven culture can include:

Leaders, business users, and IT who are confidently adapting in real time and planning for an uncertain future
Reduced time to insights
A greater sense of community, enterprise-wide
A more motivated, more efficient workforce
More informed decision-making with a single source of truth
A quicker path to failure...and effective recovery
Everyone speaking the same language with increased data access and transparency
Cross-team collaboration and innovation on behalf of the business and customers
A pivot from data paralysis to business resilience and growth

Agility, swift execution, and better-quality data are mandatory

With all survey respondents, we found three top-of-mind priority areas resulting from lessons learnt during the pandemic: a need for greater agility with changing demands (30 percent), effectively prioritising and delivering on projects faster (26 percent), and needing more accurate, timely, and clean data (25 percent). We anticipate that in the next 12 months, these areas, amongst others, are where European businesses will focus significant time, attention, and resources. Likewise, they will turn to technology partners who can support this work as they think about and swiftly become digital, data-driven organisations that both survive and thrive in the face of adversity.

The value of data analytics to achieve resilience

Will we experience another six or 12 months of disruption? It’s hard to predict the future, but what we inherently know from observing and listening to customers and prospects, plus talking with tech stakeholders in various industries, is that resilient organisations empower their people with data. This allows them to creatively solve problems, respond to change, and confidently act together. Now, businesses should unite their people with data—to gain a shared understanding of their situation, establish realistic and attainable goals, and celebrate what might be small wins as they build resilience while facing adversity.

Even if your organisation is less data-driven or feels like it doesn’t have the right expertise, you can take cues from others that found agility and resilience with data and analytics. Becoming data-driven is not out of reach; it’s an achievable goal to strive for with the support of easy-to-use, flexible solutions and resources that will help you quickly start and develop the right culture. To ensure your organisation harnesses the power of being data-driven, consult these resources and simple steps to help you get all hands on data. Learn more about the YouGov research and download the e-book.

#ElectionViz: US TV networks have room for data storytelling improvement from What's New

Andy Cotgreave Technical Evangelist Director Tanna Solberg November 17, 2020 - 9:04pm November 18, 2020

Editor’s note: A version of this blog post originally appeared in Nightingale, a publication by the Data Visualization Society.

How was US general election night for you? For me, it was underwhelming.

That emotion had nothing to do with the political story, but everything to do with the data storytelling of US TV networks. My goal on election night was to enjoy and comment on the way they told data stories, which you can find on Twitter under #ElectionViz. I came away disappointed.

The gulf between the charts we find on news websites and US TV networks is enormous. News websites offer sophisticated experiences, whereas the networks offer—well, not much more than screens dominated by geographical shapes of counties in the US. There is not a great deal of difference between this year’s screens and those from 1968.

Let’s take CNN as an example. John King, like the anchors on most networks, is an amazing commentator. These anchors’ knowledge of the US political landscape and their ability to narrate events are hugely impressive. Unfortunately, their words were not supported by visuals that would have made it easier for an audience to follow along.

Orange County, Florida map as seen on CNN at 8pm EDT on November 3, 2020.

Almost without fail, when an anchor zooms into a county map, they make three data-driven observations: What is the current split between candidates? How many votes have been counted? How is this different from 2016?

Also, once the narrator has zoomed into a county, the shape or location of the county is no longer a primary piece of information to focus on. Given that, how easy is it to answer the three questions the narrator needs to answer? It’s not at all easy.

What if we changed the display to focus on the three questions? It could look something like this:

A reimagined screen for CNN. The geography is now just a small thumbnail, alongside a vote count progress bar. The candidates’ vote numbers, instead of a text box, are shown as bars. A slope chart on the right shows the swing from 2016.

These are not complex charts: bars and slopes use the most basic building blocks of data visualization, and yet, in an instant, we can see the information the narrator is describing.

All the major networks I followed used the same template: map-driven graphics with little thought to the little things that could have greatly enhanced the stories being told. Steve Kornacki on MSNBC did take full advantage of the sports-style telestration board with extensive use of hand-drawn numbers and circles. These enhanced the visual power of his explanations.

Steve Kornacki on MSNBC using annotations to enhance his story.

Beyond the maps, I was surprised at how few visualizations the networks created. There was the occasional line chart, including a nice one from NBC. It was well laid out, with clear labeling and an identifiable data source. My only quibble was the positioning of the party annotations. It’s always nice if you can put the category label at the end of the line itself.

In any TV coverage, it’s only a matter of time before you see a pie chart of some sort. The first I saw was also on NBC. Take a look at this, and try to decode the pie chart. Pay attention to how many times your eye moves across the chart as you do so:
How long does it take to parse the information now? Which is easier and faster to read? The donut or the bars?

As I watched the live feeds of the news websites through the night, it was clear that traditional print media are streets ahead in terms of data storytelling. It’s not because their browser-based graphical displays are complex, or that they appeal to data geeks like me. It’s because they consider the questions audiences have and focus the display on delivering answers as quickly as possible. What seems to be missing is the fundamental question any data storyteller needs to ask: What are the key questions I need to answer? How can I present the information so that those questions can be answered as easily as possible?

On reflection, I was surprised by the information design conservatism of the US TV networks. Comparing today’s coverage to coverage in 1968, other than the addition of color, the displays are still tables of numbers and the odd map. I did #ElectionViz for the UK General Election in December 2019, and the visualization maturity of Sky News and the BBC was far ahead of that of the US networks. As the dust settles and we move towards 2024, I would love to see a little more visual sophistication to support the amazing anchors.

Ok Twitter! It's 3.30am in the UK and I don't think there are any more new charts the media have up their sleeves. So I'm calling it a night for #ElectionViz. Thank you for following, it's been quite the exprience. For now: a whisky to toast you all: pic.twitter.com/4QjZnkHglu — Andy Cotgreave (@acotgreave) November 4, 2020
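To make the donut-versus-bars comparison concrete, here is a minimal matplotlib sketch. The party labels and vote shares are placeholder values made up for illustration, not the figures NBC showed on air; the point is only to render the same data both ways and notice how many eye movements each one demands.

```python
import matplotlib.pyplot as plt

# Hypothetical vote shares -- placeholder values for illustration only,
# not the figures NBC showed on election night.
labels = ["Candidate A", "Candidate B", "Candidate C"]
shares = [51, 44, 5]

fig, (ax_donut, ax_bar) = plt.subplots(1, 2, figsize=(10, 4))

# Donut: the reader hops between legend, wedge, and number to decode it.
ax_donut.pie(shares, labels=labels, autopct="%1.0f%%",
             wedgeprops={"width": 0.4}, startangle=90)
ax_donut.set_title("Donut chart")

# Bars: labels and values share a common baseline, so one glance suffices.
ax_bar.barh(labels, shares)
ax_bar.invert_yaxis()                       # largest value on top
for y, value in enumerate(shares):
    ax_bar.text(value + 0.5, y, f"{value}%", va="center")
ax_bar.set_xlabel("Share of votes (%)")
ax_bar.set_title("Bar chart of the same data")

plt.tight_layout()
plt.show()
```

The same side-by-side comparison can, of course, be built in Tableau; the library is incidental, the shared baseline is the point.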

Introducing improved online/offline flows in Tableau Mobile from What's New

Shweta Jindal Product Manager Jim Cox Staff Product Manager Tanna Solberg November 17, 2020 - 5:21pm November 17, 2020

Having access to your data while on the go is important for making decisions at the speed of business. But as a Mobile user, you may not always be able to connect to Tableau Server or Tableau Online—perhaps you’re on a plane or visiting a customer site where a network connection may be unavailable. Fortunately, Tableau Mobile provides offline access to all the interactive dashboards and views that you save as a Favorite. These downloaded vizzes are called Previews.

Today, we’re excited to announce that we’re introducing a change in the way that Tableau Mobile shows these previews to create a more seamless and intuitive experience—both when the device is connected and when it’s not.

When the device is connected to the server

Current connected experience

Up until now, when you launch a favorite view, the preview loads quickly with limited interactivity and displays a ‘Go Live’ button. Tapping ‘Go Live’ initiates a server request, and you’ll see a spinner while the view is calculated. You will be switched to the view once it is rendered.

We thought this flow would work well for the majority of cases. The preview loads quickly and with sufficient interactivity—you can clearly see the data (pan, zoom, and scroll), tap any mark to see a tooltip, and see highlighted actions when a mark is tapped. Only when you attempt to change a filter value does the app warn that a server connection is required and instruct you to tap ‘Go Live’. However, in the real world, we have observed that the majority of people tap ‘Go Live’ immediately. If the server is connected, they just want to see the latest version of the interactive dashboard.

New and improved connected experience

We want to reduce friction in your workflow so you can use the previews in a more helpful way. Now, when you tap on a view from Favorites, we show the preview immediately. We also make the server request for the latest view and load it in the background. There is no ‘Go Live’ button—instead, a banner message lets you know that the latest view is loading. If you haven’t interacted with the preview, the app transitions seamlessly to the latest view when it has loaded, and the banner disappears. With this flow, you see the preview immediately while the latest view loads in the background—all without experiencing a spinner. Once the latest view is available and you haven’t interacted with the preview, you will be switched to it automatically.

If the latest view is taking a few seconds to load, you can opt to interact with the preview by scrolling or tapping. When this happens, we don’t automatically transition to the latest view—we don’t want to disrupt your flow. Instead, we added a button to the banner. The latest view, which has already loaded in the background, is surfaced when you tap ‘See Latest View’.

When the device is not connected to the server

Current offline experience

Up until now, when you launch a view, the preview loads with limited interactivity. The Go Live button is presented as an option—even if the device is disconnected and can’t go live. When you tap the Go Live button, Tableau Mobile attempts to contact the server but fails and displays an error message. In this case, you benefit from having the preview available immediately, even when offline. But you may not know that the device is disconnected—so tapping ‘Go Live’ and getting an error message is not the greatest experience.
New and improved offline experience

Now, if you’re not connected, the banner reports that the app cannot load the latest view and offers you a button to see the reason. Tapping ‘See Error Details’ shows an error page explaining there is no server connection. In this case, you can continue to see and interact with the preview, but without a potentially confusing ‘Go Live’ button on the screen.

Summary

With these new flows, you’ll transition seamlessly to the latest view when connected—without ever having to see a spinner or tap a button. Plus, you can interact with the dashboard while the latest view is loading. And when you’re offline, the new flow shows the available preview without a confusing Go Live button. We hope these changes make using Tableau Mobile a faster and more pleasant experience for all.

Download the latest version of the Tableau Mobile app to enjoy this new experience—available on both the Apple App Store and Google Play. If you have any questions or feedback, please reach out at sjindal@tableau.com.
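The connected and offline flows described above boil down to a small piece of decision logic. The sketch below is illustrative only: the names and step strings are hypothetical stand-ins, not Tableau Mobile's implementation. It simply captures the behaviour described in this post: show the preview immediately, load the latest view in the background, switch automatically only if the user hasn't started interacting, and fall back to an explanatory banner when offline.

```python
# Illustrative sketch only: these names are hypothetical stand-ins,
# not Tableau Mobile's real code.
from dataclasses import dataclass

@dataclass
class Context:
    server_reachable: bool
    user_interacted: bool   # did the user scroll or tap the preview?

def open_favorite_view(ctx: Context) -> list[str]:
    """Return the sequence of UI steps for one launch of a favorite view."""
    steps = ["show preview immediately (no spinner)"]

    if not ctx.server_reachable:
        # Offline: keep the interactive preview; no dead 'Go Live' button.
        steps.append("banner: can't load latest view -> 'See Error Details'")
        return steps

    steps.append("banner: loading latest view in the background")
    if ctx.user_interacted:
        # Don't disrupt the user's flow; let them opt in.
        steps.append("banner: 'See Latest View' button when ready")
    else:
        steps.append("switch to latest view automatically, hide banner")
    return steps

if __name__ == "__main__":
    for ctx in (Context(True, False), Context(True, True), Context(False, False)):
        print(ctx, "->", open_favorite_view(ctx))
```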

What data collaboration looks like and how to achieve it from What's New

Forbes BrandVoice Tanna Solberg November 11, 2020 - 5:17pm November 11, 2020

Editor's note: This article originally appeared in Forbes.

Data is inexhaustible—there’s more of it than we could ever imagine or use. But in important ways, it’s just like conventional raw material in that its value is derived from what becomes of it—not the individual data points themselves. A data stockpile, for instance, is of little interest or use.

It’s why Matthew Miller, product management senior director at Tableau, urges people not to assume that every pretty data dashboard yields useful results. “As much as we love data, and we love insights, insight alone doesn’t transform organizations,” he said. “And no one is measured on how many dashboards they’ve looked at, or how many dashboards they produce, or how many petabytes are in their data warehouse. It’s about driving organizational performance.”

By putting people in the right circumstances—with the right tools and access to the right data—organizations can do amazing things. That was clear in the early months of the Covid-19 response, when organizations with people who could make informed decisions, despite uncertainty and rapidly changing circumstances, had an advantage. Even as organizations compressed years of digital transformation roadmaps into the span of just a few weeks or months, they were able to make choices that outmaneuvered competitors in the most crucial moments. Leaders can build on those lessons by laying the cultural foundation for productive and valuable collaboration around data. It’s there, and not in data lakes or stylized visuals, that the real coin is minted.

To understand data collaboration, think about caller ID

In the beginning of wired phone service, phone companies kept meticulous databases of customers and phone numbers. They used this data internally to bill customers, route calls and provide value-added services, including phone books and operator-assisted number lookup. Monthly bills would typically list every number called over the span of a month in case that look-back analysis might be of use.

Then caller ID came onto the scene. Caller ID didn’t create any new data. It simply presented existing data to a user in a timely fashion at a decision point: Tell me who is calling so I can decide if I want to answer. Telephone users didn’t have to make a big change to their workflow to use this information—today it appears on a phone’s screen as a matter of course. Nobody has to push additional buttons or perform a special data lookup. The end user gets a valuable piece of information at the precise moment they have reason to ask.

This outcome should be a goal of every organization seeking to instill a productive data culture, said Richard Starnes, principal in Deloitte’s analytic and cognitive offering, who consults with Deloitte clients on analytics and business-intelligence solutions. “Turning into a data-driven organization means you have got to figure out a way to get that data unencumbered into the hands of the people that can be creative and effective with it,” he said.

Hallmarks of data harmony

Getting data into the hands of people who will know exactly what to do with it starts with building sensible workflows. And for an organization to use data persuasively, the output may come closer to resembling a snapshot than a complex workbook. Here are three traits that are found within data-driven companies:

A Virtuous Cycle Of Input And Output.
Miller recommends boiling down the analytical cycle to a repeatable process: a piece of information provokes an action, which kick-starts a process, which produces a new piece of data, which provokes an action. Analytics processes that don’t fit this mold are possible and can be valuable, but if those are the rule rather than the exception, it may be a sign that low-hanging fruit is being left unpicked.

Data Workflows That Reflect How People Already Work.

The best data-driven processes should support and enhance the jobs and responsibilities of the people engaging with them, and simplify the typical problems they are trying to solve on a daily basis. “Human patterns of collaboration illuminate how to design data systems for harmony,” said David Gibbons, senior director for analytics at Salesforce. “The shape of the data can sometimes help you to understand who your team is going to connect and interact with in order to complete their work more effectively. And a flexible data platform that lets you embed analytics wherever they are needed in the middle of those collaborations will maximize success and increase data harmony in the process.”

Insights That Are Easy To Consume.

Instead of using complex, multi-tab workbooks to express key findings, some organizations are producing simple dashboards that are easily consumable for every level of understanding. They’re also taking advantage of features that allow them to track key metrics, kind of like how you would track stocks in an investment portfolio. And for those who are primarily focused on answering “what should I do with this information?”, artificial intelligence can help translate complex data into immediate next steps. “Not every business user wants to see the full data set. AI features are now built into BI [business intelligence] platforms, so users can get specific recommendations to help them make faster decisions. This results in benefits like closing deals faster or resolving customer cases with higher satisfaction,” Gibbons said.

Leader’s checklist: 4 steps toward data success

Data success comes more easily if senior leaders make it a priority and show in their own work how analytics should be used to drive business outcomes.

1. Share and leverage the knowledge of others

One of the obstacles to effective collaboration around data is grading yourself against your peers. In other fields, it’s easy to rate and clone processes that lead to low manufacturing defect rates or efficient supply chain execution. Data is more slippery, and so the sources and skill sets that serve one team or company best might not work the same way at another.

Overcome these challenges by ensuring that your data collaboration efforts are as inclusive as possible, bringing collaborators into the fold early to discuss and improve processes and outcomes. Starnes said IT is typically best positioned to support the bottom-up work of making data platforms efficient, effective and credible, while business leadership can work from the top down to put the money and strategic support behind the last mile of data delivery and collaboration.

2. Cater to a wide range of knowledge and talent

Giving every employee access to seemingly limitless data sources and analytical tools won’t turn them all into equally effective knowledge workers. Although that approach may serve trained data scientists and can help data savants bubble up to the surface, it’s not the most effective or coordinated way to collaborate.
“There are plenty of organizations that have failed despite having massive data warehouses and big analytics investments,” Miller said. Instead, ask your talented employees to articulate the problems they need data to solve, and have the experts focus on ways to help them.

3. Create spaces where contributors can seek advice

Instead of putting more training hours on people’s calendars, create spaces that encourage employees to ask questions. These can be virtual spaces for teams, or drop-in clinics, of sorts, with a rotating cast of collaborators who can bring a wide range of skills to the table.

Whatever platform is used to create the space where a data community can convene, it’s crucial that it’s flexible. As we learned from this year’s sudden shift away from commuting, business travel and office usage, data consumption trends and analytical needs will change. In 2019, trends emphasized pushing more data to smaller screens and mobile devices. But over the past several months, the share of data consumed at desktop screens has climbed considerably.

4. Make data easy to question and validate

Many dramatic stories of data and analysis focus on a game-changing realization or an unanticipated surprise. The real world is more prosaic. The truth is that data is frequently used to confirm well-founded intuitions and assumptions.

“Most executives have a gut sense of how they’re doing, and when they see the analytics, they aren’t often all that surprised,” Miller said. “So if they see a number and wonder where it came from, you need to be able to track it back to the source.”

One effective way to ensure that happens is to put a human face on every piece of data and analysis exchanged. You can do this by certifying data sources—effectively putting a mark of approval to show the data is up-to-date and trustworthy. And there should be cultural support and incentives for providing timely responses and explanations, so that decisions aren’t stalled and insights aren’t discarded by users who don’t have time to wait for the explanation. Discovering the crucial facts and most valuable insights is a collaborative process. Ask contributors how data and analysis played a role in a recent success. Discuss the most-loved and least-loved data experiences and search for common threads.

Visit Tableau.com to learn how to empower more people with data and explore stories of data-driven collaboration.

Tune in to use collections, playlists for your data from What's New

Ann Ho Senior Product Manager, Tableau Tanna Solberg November 10, 2020 - 4:58pm November 10, 2020

Take a look at what you have in your music library. You’ve got songs from different albums, artists, and even genres. You probably have music for different moods, activities, or times of day. Just like with songs, the data you use isn’t always contained in the same project on your Tableau site. And you might have trouble remembering where to find the assets you don’t use regularly. With our new collections feature, you can gather the data from across your site and organize them to fit how you use them—just like playlists!

If you’re a Tableau Online customer interested in getting an early look at collections, you can sign up to join our Tableau 2020.4 Collections Limited Preview Program. We’ll reach out to enable collections for your site so your users can try it out.

Organize your data the way you think about your data

Many customers model their Tableau sites and projects after their organizational structures: by departments, regions, or a nested hierarchy of both. But when it comes to using data, most people collaborate with other teams and departments. Often, this means navigating in and out of different projects to get to the right data and content.

Collections introduce the ability to curate content from across various projects—helping users navigate and find the content they need in the context of how they need to use it. You can promote specific content to help new users find relevant data, align content to workflow processes for easier reviewing or archiving, and even use collections in meetings and quarterly reviews to ensure everyone is referencing the same dashboards.

Collections function as lists—you aren’t making copies of your assets, moving them in or out of their project folders, or changing any security permissions. You can keep a collection private just for you, or make it public so others can search for and use it, too. And the same content can be added to different collections, helping keep data conversations centered around a single source of truth.

Getting started with collections

Once collections are enabled for your Tableau Online site, you’ll see a new option in the left navigation for collections. Here, you’ll find all the collections you have access to—the ones you own and the ones in your site that have been made public. You can also see the collections you own on a tab in My Content.

As long as you’re a licensed user on a site, you can create a collection! Click the New Collection button at the top of the collections page to create your collection and edit the default name. As you browse through projects, add content to your collection right from the item’s action menu. Or, select multiple assets and use the multi-select action menu to add them all to a collection.

You can add many kinds of data assets from your site to a collection, including workbooks, Prep flows, and even data roles. Currently, you can’t add a collection to another collection, or add custom views, databases, and tables.

Sharing collections

Just like playlists, you can share collections and help others find the data you’ve curated for specific projects, meetings, or tasks. By default, collections are made private so that only you (and administrators) can see them. To share a collection, you’ll need to change its permissions to public—this allows anyone on the site to browse and find it. You can also choose to send a note to let your colleagues know about the collection.
When you use the Share button to send a link to your collection, the recipients will get an email. Plus, the collection will appear in their Shared with Me channel on the Home page in Tableau Online to make it easy for them to find again. Even though the collection is public, the security permissions for each of the items within it won’t change—people will only see the data they have access to.

Try collections for yourself—join our limited preview!

We’re excited to release this new feature to all of our customers, but first we’d love to hear from you! If you’re a Tableau Online customer and are interested in getting an early look at collections, you can learn more about how to use collections and sign up to join our Tableau 2020.4 Collections Limited Preview Program. We’ll reach out to enable collections for your site so your users can try it out. Your feedback is important to us to better understand how you engage with data so we can build experiences that serve your needs—so thank you! We can’t wait to see all the great ways you’ll use this new feature.
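One way to internalise the playlist analogy: a collection is just a named list of references to existing items, and permissions stay on the items themselves. The toy sketch below is purely conceptual (it is not Tableau's implementation or API), but it shows why adding a workbook to several collections never duplicates it or changes who can see it.

```python
# Conceptual toy model of collections -- not Tableau's implementation or API.
from dataclasses import dataclass, field

@dataclass
class Item:                      # e.g. a workbook, flow, or data source
    name: str
    allowed_users: set[str]      # permissions live on the item itself

    def visible_to(self, user: str) -> bool:
        return user in self.allowed_users

@dataclass
class Collection:                # a "playlist": references, never copies
    name: str
    items: list[Item] = field(default_factory=list)
    public: bool = False         # private by default

    def add(self, item: Item) -> None:
        self.items.append(item)  # stores a reference, not a duplicate

    def browse(self, user: str) -> list[str]:
        # Viewers only see the items they already have access to.
        return [i.name for i in self.items if i.visible_to(user)]

sales = Item("Sales dashboard", {"ann", "devs"})
hr = Item("HR headcount", {"ann"})

weekly_review = Collection("Weekly review", public=True)
weekly_review.add(sales)
weekly_review.add(hr)

print(weekly_review.browse("ann"))   # ['Sales dashboard', 'HR headcount']
print(weekly_review.browse("devs"))  # ['Sales dashboard'] -- permissions unchanged
```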

Best of the Tableau Web: Learnings from TC(ish) and Iron Viz from What's New

Andy Cotgreave Technical Evangelist Director, Tableau Tanna Solberg November 9, 2020 - 5:51pm November 9, 2020

Phew! It’s November. Welcome back to Best of the Tableau Web, where I share some of the best community content I’ve seen in recent times. And since we’re right on the heels of our first ever virtual Tableau Conference(ish), there’s a lot to highlight! For this segment, three themes come to mind.

First: Iron Viz is the ultimate data competition and it was a privilege to host this event with Keshia Rose again. Going virtual this year presented challenges, but also many new opportunities. As I write, we’re in the midst of releasing special episodes of our weekly livestream series, If Data Could Talk, with all three finalists. In each one, we’re pressing play on their 20-minute Iron Viz build and getting the inside scoop on all their data decision making. I’m really excited about these episodes because there’s a huge amount to learn from them for data viz enthusiasts of all skill levels. You can watch the episodes live for free on all our social media channels (no sign-up needed!) so make sure to follow Tableau on Twitter, LinkedIn or Facebook. Watch the episode with Alex Jones here. Watch for upcoming episodes with Simon and Christian on the 12th and 19th of November. Christian has also done a fantastic recap of his experience on his own blog if you want a sneak peek.

Second: Let’s talk Tableau Conference(ish)! It’s always a pleasure reading people’s recaps of Tableau Conference for specific learnings, key themes they take from the event, and above all else, inspiration around the Tableau community. I particularly enjoyed these thoughtful perspectives from Adam Mico, Sarah Bartlett, and Dustin Wyers.

The third and final theme is yet another focus on what must be the most revolutionary feature we’ve added to desktop in years: Set and Parameter Actions. These features continue to open up new paths to interactivity that astound me. Keith Dykstra explains how to simplify them, Spencer Baucke provides an overview of Set Controls, Ethan Lang shares ways to use them with secondary data sources, Steve Wexler applies them to survey data, and Andrew Watson uses them to build some great Viz in Tooltips. This functionality seems to have endless possibilities: I say keep ‘em coming.

As always, enjoy the list below! Follow me on Twitter and LinkedIn as I try and share many of these throughout the month. Also, you can check out which blogs I am following here. If you don’t see yours on the list, you can add it here.

Tips and tricks
Andrew Lang Building a Relationship with Scaffolding
Andy Kriebel #TableauTipTuesday: Four Methods for Creating Dots on a Map
Jim Dehner FAQ Series - Fiscal Years
Rosario Gauna Write-back for Everyone: Parameter Query Language for Tableau (Part 1)
Anthony Smoak Understanding Tableau Context Filters
Jess Hancock Full Outer and Inner Joins with Multiple Inputs: The ‘Join Multiple’ vs ‘Manual’ Method

Inspiration
Spencer Baucke Data + Love Podcast
Brandi Beals Advance Your Career with Tableau (for FREE)

Formatting, Design, Storytelling
Rajeev Pandey 30 Best Design Resources for Your Next Tableau Dashboard
Beth Kairys Virtual Event Recap: The Good and Bad of Dashboard Templates
Matthew Whiteley Tableau 2020.3: Best New Features for Dashboarding
Jeffrey Shaffer Dashboards Done Right: Insurance Dashboard by Ellen Blackburn
Matthias Think twice – is it important, is it attractive?
Luke Stanke KPI Design Ideas for Tableau
Tamas Varga Creating Charts within a Hexmap

Calculations
Eric Parker Nested IF Statements in Tableau
Daniel Caroli Tableau Date Calculations: When Close is Good Enough
Igor Garlowski Interactive Date Comparisons with Tableau Parameters
James Goodall Tableau Bitesize: Buffer Calculations

Prep
Spencer Baucke Tableau 2020.3 – Write to Database in Tableau Prep
Igor Garlowski Local Scheduling and Flow Refreshes with Tableau Prep

Server
Mitchell Scott Switching from Core to Role-Based Licensing in Tableau Server
Rowan Bradnum Increasing Tableau Server User Adoption: Five Ways to Earn Quick Wins
Timothy Vermeiren Tableau Server Webhooks, REST API and Slack: a learning experience
Carl Slifer Increasing Tableau Server User Adoption: Improve Data Quality & Content

Set and Parameter Actions
Keith Dykstra Simplified Parameter Actions in Tableau
Spencer Baucke Tableau 2020.2 – Set Controls
Ethan Lang How to Use Secondary Data Sources for Tableau Parameter Actions
Steve Wexler Set Controls and survey data – how to compare responses for this group vs that group vs overall
Andrew Watson Use Set Action for Viz in Tooltip Grand Totals

Tableau Conference(ish)
Adam Mico The (ish) Experience: My 1st Tableau Conference
Sarah Bartlett Tableau Conference….ish #Data20 Highlights
The Datum Podcast S3 E9: Byte: Tableau Conference 2020 roundup
Dustin Wyers Tableau Conference 2020: Highlights from Devs at Desks

Locked nested projects provide unprecedented flexibility in governing your site from What's New

Mark Shulman Product Manager, Tableau Kristin Adderson November 5, 2020 - 8:48pm November 6, 2020

It’s okay to admit it—when you heard that Tableau introduced locked nested sub projects in 2020.1, it may not have given you goose bumps or sent shivers down your spine. But we are here to say that if you’re responsible for governance and structuring your Tableau site, it may be one of the most powerful features to come along in quite a while. This “little” feature is easy to overlook, but it has a big positive impact on minimizing the need for additional sites, delegating admin responsibilities, and providing the flexibility that your organization needs.

How do locked nested projects change my site management?

Mark Wu, Tableau Zen Master, stated, “Sub projects can now be locked independently. This is a game changer for all of us!” To understand why this is a big deal, let’s go back to pre-2020.1 days. When you locked a project, the entire locked parent/child hierarchy underneath it had the exact same permissions. This limitation required you to either have a very broad, flat tree structure—or you may have worked around it by spawning unnecessary new sites. You don’t need to do that anymore!

Before: Pre-2020.1, the world was broad and flat. A locked project drove the same permissions down to all of its sub projects, which triggered the proliferation of more top-level projects. User navigation of the site is much more challenging with so many projects. This limitation also led some to stand up additional sites, which has the downside of creating a “hard” boundary to sharing data sources and can make collaboration more challenging. Sites require duplicate project hierarchies, increasing the effort to create, permission, and manage across them.

After: Your site structure reflects your organization’s depth. You can now lock a project at ANY level in your site’s project folder structure, regardless of whether the parent is locked with different permissions. That allows you maximum flexibility to structure and permission your site in ways not possible before Tableau 2020.1.

Why would I want to use locked nested projects?

Many organizations want to manage their content and permissions in ways that mimic their organizational structures. Think of all the potential benefits. Now you can empower project leaders or owners to lock sub projects with the permissions that meet their specific group needs at any level in the hierarchy. You free up admin time by delegating to the folks closer to the work, and help your Tableau site to be better organized and governed.

Locked nested projects simplify permissioning by allowing you to:

Lock sub projects independently
Simplify top-level projects
Group similar projects together
Ensure access consistency
Ease admin burden by delegating to project owners/leaders

Organizations aren't flat, and neither is the way you govern your content. You can now organize your Tableau site and projects exactly the way you want—by department, region, content development lifecycle, or perhaps a combination. Below are three common examples of organizing projects that can take advantage of locked nested projects. Each color represents different group access that could be locked in place at the sub project level and below.

Overall, we strongly recommend using locked projects along with group permissions rules to help ease the management of a Tableau site. Unlocked projects can promote a wild west culture, where everyone manages their own content permissions differently.
In contrast, locked projects ensure consistency of permissions across content and provide the ability to delegate the admin role to project owners or leaders who know the appropriate details for each group’s content.

How do I use locked nested projects?

You can apply locked nested projects to an existing project hierarchy regardless of when you created it.

Click the three-dot Action menu - Permissions... for a project.
Click the Edit link in the upper left of the Permissions dialog.
Click Locked and then check Apply to nested projects.

It’s very easy to overlook the new check box when you click to edit the locked settings for a project.

Where can I find out more about locked nested projects?

There are many resources available to get you started with locked nested projects:

Tableau Help: Lock Content Permissions
Tableau Help: Use Projects to Manage Content Access
Tableau Blueprint: Content Governance
Tableau Community: V2020.1 Nested Project Permission (Thanks to Mark Wu)
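For admins who script their site setup, a project's content permissions can also be set programmatically. The sketch below uses the tableauserverclient (TSC) Python library; the server URL, token, site name, and parent project LUID are placeholders, and the exact content_permissions values accepted by your server version (for example, whether 'LockedToProjectWithoutNested' is available) should be confirmed against the REST API documentation rather than taken from this example.

```python
# Sketch using the tableauserverclient (TSC) library. URL, token, and parent
# project ID are placeholders; confirm content_permissions values against
# the REST API docs for your Tableau Server/Online version.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("my-token-name", "my-token-secret", site_id="mysite")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Create a sub project under an existing parent and lock its permissions
    # independently of that parent.
    finance_reports = TSC.ProjectItem(
        name="Finance - Certified Reports",
        parent_id="parent-project-luid",   # placeholder LUID of the parent
        # "LockedToProject" locks this project (and, on 2020.1+, cascades to
        # its own nested projects); "LockedToProjectWithoutNested" is the
        # variant that locks only this level -- check the docs for your version.
        content_permissions="LockedToProject",
    )
    finance_reports = server.projects.create(finance_reports)
    print(f"Created locked sub project {finance_reports.id}")
```

After creating the project, you would add group permission rules to it as usual; the lock simply guarantees that every workbook and data source inside inherits those rules consistently.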

Five things data storytellers can learn from 2020 US election poll trackers from What's New

Andy Cotgreave Technical Evangelist Director, Tableau Tanna Solberg October 28, 2020 - 9:19pm October 30, 2020

In the latest episode of Chart Chat, Amanda Makulec, Steve Wexler, Jeff Shaffer and I discussed the lessons data storytellers can learn from 2020 US election poll trackers. Below are my five main takeaways that you can apply in your own analyses. To get the details, watch the full episode of Chart Chat.

Embrace uncertainty when making predictions

Figure 1: Visualization from FiveThirtyEight showing potential outcomes of the 2020 US presidential election.

Predictions, even based on clean data sets, are uncertain. In 2020, many poll trackers emphasize this, possibly as a result of perceived errors in 2016. They do not want viewers to come away with an expectation that one candidate or another “will” certainly win.

Poll trackers are embracing uncertainty in various ways. The New York Times (NYT), CNN, and others ask you to make decisions on the outcome. Instead of calling toss-up or swing states on your behalf, they force you to interact with the visualization and discover these swing states for yourself. By engaging the user, they emphasize the fact that these are predictions—not mandated outcomes.

When you open FiveThirtyEight’s forecast pages, you see multiple US maps, each showing a different outcome. This is an arresting sight on first look that is, I think, very powerful. You can’t help but stop and process the meaning of what you’re seeing: multiple possible outcomes based on running multiple simulations. Further down the page, their main predictive plot is a chart with 100 dots showing representative outcomes of 40,000 simulations. You can’t come away from this without knowing you are seeing a range of possible outcomes.

Takeaway #1: When you communicate predictions in your organization, do your audiences understand the levels of uncertainty they contain?

Slow data adds context and fosters engagement

Figure 2: Visualization from the New York Times showing the swing states Trump and Biden need to win to reach 270 electoral votes.

When working with data, we are often too quick to jump to a conclusion and then share an insight, perhaps without really digesting the meaning. A clear trend for this election is how the trackers force a more meaningful engagement. “Slow data” as a concept was first proposed by the New York Times back in 2011, and further refined for the analytics industry by Stephen Few in 2013. More than ever it seems that these ideas are ones we should embrace.

FiveThirtyEight first shows you an unusual set of maps. As you scroll, you start with a written analysis, not a chart. They are encouraging you to engage in the context of the data. Only after those two sections do you reach the predictive plot. In previous elections, this chart would’ve been at the top of the page, intended for your rapid consumption. The NYT’s drag-and-drop visualization, which also forces you to interact, is another example of engaging readers: you can’t reach a result until you have acted. Although we didn’t discuss it in Chart Chat, the Financial Times also adds a lot of commentary among their poll tracker charts: as you look at their tracker page, each chart is interspersed with commentary that is updated regularly. All these examples bring people back from the brink of making conclusions too quickly, and encourage people to engage more deeply with the information.

Takeaway #2: What ways do you provide additional context to your data displays?
Do you encourage people to consider more than just the height of a bar on a bar chart or the slope of a line chart?

Cutesy or dry? Novel or complex?

Figure 3: Visualization from FiveThirtyEight showing a sample of 100 outcomes of the 2020 US presidential election.

One of the biggest challenges for any data communication is whether to make it dry and functional, or cute and engaging. Both are valid choices, but you must have reasons for choosing either path. FiveThirtyEight has leaned heavily towards a cutesy and engaging approach. The Fivey Fox mascot pops up next to most charts with call-outs to further information. Their charts have a playful design, including caricatures of Trump and Biden.

We compared how CNN’s “Road to 270” tracker is essentially the same as the NY Times tracker. In both cases, you choose how the states will cast their electoral votes and watch as progress towards one candidate having 270 electoral votes changes. The NY Times chart has bouncy circles that you drag into a cute Voronoi circle chart. The circles pop around and it’s fun to play with. CNN has you click on a map, which changes the colors of a horizontal stacked bar. Functionally, the two charts are the same, but one is playful while the other is purely functional.

I find the NYT’s draggable bubbles a middle ground between cute and dry. The way they bounce as you let them go is engaging. Most other organizations choose a more austere, dry path. Is dry better than cute? I cannot answer that because it mostly comes down to personal preference. Personally, I’ve stopped noticing Fivey Fox now that I’ve seen him many times, but Amanda said she likes him and thinks his inclusion brings a moment of levity to what is a very serious subject.

Takeaway #3: Which approach do you take for your internal or external data communications? Do you consider whether to bring levity to your insights, or stay austere and serious?

Geospatial data doesn’t have to be a map

Figure 4: CNN showing an electoral map that can be adjusted to simulate changes to who receives a state's electoral votes.

There’s another reason to compare the CNN and NYT visualizations. CNN chose to encode their US state data on a map. The NYT put the states in circles based on whether they will vote Trump or Biden. CNN’s approach means you can readily find any given state, if you know US geography. If you want to look up a specific state, it’s easy. In the NYT chart, it is not easy to find any particular state, requiring you to read through state abbreviations. This doesn’t mean the NYT got it wrong. By putting states beneath a candidate, and making the circle bigger or smaller, you can more easily see which candidate is more likely to win. A map doesn’t afford dynamic changes like that. These are decisions we always have to make when working with geospatial data: a map is always tempting, but sometimes there are options that reveal different insights.

Takeaway #4: Before you make a map with your geospatial data, ask yourself if it’s the best way to share your insight.

Data geeks love to hunt for historical Easter eggs

We talked at length about “The winding path to victory” snake chart on the FiveThirtyEight website. It’s an unusual method of displaying data. Personally, I find it a bit confusing, but during Chart Chat, others pointed out that the novelty is a good way to draw in beginners. I recommend watching that segment because the discussion was fascinating.
The snake chart has been a long-standing part of their poll tracking, but Steve called out its similarity to Benjamin Franklin’s Join or Die cartoon from 1754. A conversation on Twitter has, at the time of writing, yet to determine if the similarity is by design. Either way, I hope it is. Our field of visualization sits on the shoulders of giants. The 18th century was a period of great experimentation in communicating data, and much that was developed then became part of what we now call “best practice.”

Takeaway #5: Go learn about the history of our field. It is fascinating to dig into!

That is the list of our takeaways from the election poll trackers we’ve been reviewing. I’d love to know what other lessons you’ve found. If this is interesting, make sure to follow me on Twitter (@acotgreave) during the night of the US presidential and general election, when I’ll be doing live commentary of the best and worst data visualizations from TV networks: just follow the hashtag #ElectionViz. Also, check out the election projects Tableau has been doing with SurveyMonkey and Axios.
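As a footnote for readers who want to play with the simulation idea behind FiveThirtyEight's "100 dots" chart discussed above: the sketch below uses a deliberately crude model (a handful of made-up states with fixed, independent win probabilities, nothing like a real forecast) purely to show how summarising many simulated outcomes, rather than reporting a single prediction, communicates a range of possibilities.

```python
import random

# Deliberately crude toy model: a few made-up states with fixed, independent
# win probabilities for candidate A. Real forecasts model correlated polling
# error; this only illustrates summarising many simulations instead of
# reporting one prediction.
STATES = {                      # (electoral votes, P(candidate A wins))
    "State 1": (29, 0.65),
    "State 2": (20, 0.55),
    "State 3": (16, 0.50),
    "State 4": (38, 0.30),
    "State 5": (55, 0.85),
}
TOTAL = sum(ev for ev, _ in STATES.values())
TO_WIN = TOTAL // 2 + 1

def simulate_once(rng: random.Random) -> int:
    """Electoral votes for candidate A in one simulated election."""
    return sum(ev for ev, p in STATES.values() if rng.random() < p)

def summarize(n_sims: int = 40_000, n_shown: int = 100) -> None:
    rng = random.Random(538)
    outcomes = sorted(simulate_once(rng) for _ in range(n_sims))
    wins = sum(o >= TO_WIN for o in outcomes)
    print(f"Candidate A wins in {wins / n_sims:.0%} of {n_sims:,} simulations")
    # n_shown representative outcomes, evenly spaced through the sorted
    # results, in the spirit of the "100 dots" chart.
    sample = [outcomes[i * n_sims // n_shown] for i in range(n_shown)]
    print("Representative outcomes:", sample[:10], "...")

if __name__ == "__main__":
    summarize()
```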