
Tech News

3711 Articles

Updating my Kubernetes Raspberry Pi Cluster to containerd from Blog Posts - SQLServerCentral

Anonymous
03 Dec 2020
4 min read
There's been a lot of conversation happening on Twitter over the last couple of days due to the fact that Docker is deprecated in Kubernetes v1.20. If you want to know more about the reason why, I highly recommend checking out this Twitter thread.

I've recently built a Raspberry Pi Kubernetes cluster, so I thought I'd run through updating the nodes in-place to use containerd as the container runtime instead of Docker.

DISCLAIMER – You'd never do this for a production cluster. For those clusters, you'd simply get rid of the existing nodes and bring new ones in on a rolling basis. This blog is just me mucking about with my Raspberry Pi cluster to see if the update can be done in-place without having to rebuild the nodes (as I really didn't want to have to do that).

So the first thing to do is drain and cordon the node that is to be updated (my node is called k8s-node-1):

kubectl drain k8s-node-1 --ignore-daemonsets

Then ssh onto the node and stop the kubelet:

systemctl stop kubelet

Then remove Docker:

apt-get remove docker.io

Remove old dependencies:

apt-get autoremove

Now unmask the existing containerd service (containerd is used by Docker, so that's why it's already there):

systemctl unmask containerd

Install the required dependencies:

apt-get install unzip make golang-go libseccomp2 libseccomp-dev btrfs-progs libbtrfs-dev

OK, now we're following the instructions to install containerd from source detailed here. I installed from source as I tried to use apt-get to install (as detailed here in the Kubernetes docs) but it wouldn't work for me. No idea why; I didn't spend too much time looking and, tbh, I hadn't installed anything from source before, so this was kinda fun (once it worked).

Anyway, doing everything as root, grab the containerd source:

go get -d github.com/containerd/containerd

Now grab protoc and install it:

wget -c https://github.com/google/protobuf/releases/download/v3.11.4/protoc-3.11.4-linux-x86_64.zip
sudo unzip protoc-3.11.4-linux-x86_64.zip -d /usr/local

Get the runc code:

go get -d github.com/opencontainers/runc

Navigate to the downloaded package (check your $GOPATH variable; mine was set to ~/go, so cd into it) and use make to build and install:

cd ~/go/src/github.com/opencontainers/runc
make
make install

Now we're going to do the same thing with containerd itself:

cd ~/go/src/github.com/containerd/containerd
make
make install

Cool. Now copy the containerd.service file into systemd to create the containerd service:

cp containerd.service /etc/systemd/system/
chmod 644 /etc/systemd/system/containerd.service

And start containerd:

systemctl daemon-reload
systemctl start containerd
systemctl enable containerd

Let's confirm containerd is up and running:

systemctl status containerd

Awesome! Nearly done. Now we need to update the kubelet to use containerd, as it defaults to Docker. We can do this by running (note the | delimiter for sed, since the socket endpoint itself contains / characters):

sed -i 's|3.2|3.2 --container-runtime=remote --container-runtime-endpoint=unix:///run/containerd/containerd.sock|g' /var/lib/kubelet/kubeadm-flags.env

The flags for the kubelet are detailed here. I'm using sed to append the flags, but if that doesn't work, edit the file manually with vim:

vim /var/lib/kubelet/kubeadm-flags.env

The following flags need to be added:

--container-runtime=remote --container-runtime-endpoint=unix:///run/containerd/containerd.sock

OK, now that's done we can start the kubelet:

systemctl start kubelet

And confirm that it's working:

systemctl status kubelet

N.B. – Scroll to the right and we can see the new flags.

Finally, uncordon the node. So, back on the local machine:

kubectl uncordon k8s-node-1

Run through that for all the worker nodes in the cluster. I did the control node as well following these instructions (didn't drain/cordon it) and it worked a charm!

kubectl get nodes -o wide

Thanks for reading!

The post Updating my Kubernetes Raspberry Pi Cluster to containerd appeared first on SQLServerCentral.


The pace of innovation never rests: How lessons from our past still influence us today from What's New

Anonymous
03 Dec 2020
7 min read
Andrew Beers, Chief Technology Officer, Tableau – December 3, 2020

Everyone is talking about the need for innovation these days, but there are a lot of questions about the best ways to move forward. Even before the Covid-19 crisis hit, McKinsey found that 92 percent of company leaders thought their business models wouldn't stay viable at the then-current rates of digitization, and the pandemic has only accelerated this need for rapid innovation in the digital world.

As we've helped several customers navigate the uncertainty and find solutions, we always go back to what's at the core of innovation at Tableau. A recent event was the perfect opportunity to pause and look at how we've also weathered uncertainty and increased our pace of innovation throughout the history of Tableau, and how these lessons still serve us today.

The IEEE VIS conference is the premier forum for academic and applied research in visualization, bringing together an international community to share ideas and celebrate innovation every year. It also hands out the Test of Time Awards, honoring work that has endured and remained relevant for at least a decade after its initial publication. This year, Tableau co-founders Chris Stolte and Pat Hanrahan, with their former colleague Diane Tang, received the 20-year Test of Time Award for their groundbreaking research underlying Tableau, a paper titled "Polaris: a system for query, analysis and visualization of multidimensional relational databases."

[Image: The Polaris user interface with explanations from the paper.]

The Polaris paper laid out several key ideas: interactive specification of the visualization using a drag-and-drop user interface; the VizQL query language that described both the visualization and the data query; and the ability to live query relevant data directly from its database, eliminating the need to load data files into memory. In 2003, Chris Stolte, Christian Chabot, and Pat Hanrahan founded Tableau based on this work, and developed Polaris from an academic prototype into the company's first product, Tableau Desktop.

Of course, academic prototypes are usually intended to demonstrate an idea, not scale to market. To become viable, they had to transform their prototype into a product that could withstand daily use by many different people with various needs, data, and environments. Transforming a prototype into a product that could be shipped was not a trivial undertaking, as many technical and product challenges stood between our founders and building a successful company.

[Image: Dr. Chris Stolte accepting the VIS Test of Time award on behalf of his co-authors, Dr. Pat Hanrahan and Dr. Diane Tang.]

When I joined Tableau in 2004, I was Tableau's seventh employee, jumping back into a developer role after leading engineering teams at another California-based startup. As a young company, even with an incredible new product, we had to constantly knock down technical challenges and think about how to be different. We focused on giving people new ways of asking and answering questions they couldn't easily address with the existing tools they had on hand. That pushed us to figure out how to extend the original technology we had built around VizQL with even more new capabilities, including maps and geocoding, building statistical models, and supporting multiple data sources through blending and federation. This enabled us to leap ahead and show customers there were different and vastly improved ways of working with their data.
These early lessons in innovation still impact and inform everything we do in engineering and development at Tableau today. Early on, we learned to listen to what our customers were trying to accomplish, but we never stopped at only delivering what they asked of us. We also became customers of our own product by running our development team, and the entire company, on data analyzed with the product we were building. We didn't want to miss any opportunities for improvements or just build what our customers needed right now. We wanted to reinvent how we could all work with data, then do it again and again, taking ourselves and our customers on a journey past how we were working with data today to a place we thought would be more powerful.

In addition to being our own customer and critic, we knew that as a young, small company we had to demonstrate how Tableau worked, and do it fast. We often did this by demonstrating our product using data that our customers provided. This turned out to be a highly effective way to see the almost immediate impact of connecting people to the meaningful insights in their data. In fact, on one sales engagement our former CEO Christian Chabot gave a demo to about 40 people at a customer site. The demo went well, but the group was distracted. Chabot wondered what it could be and asked for feedback. He was told, rather excitedly, that the team was distracted from his demo by the insights Tableau revealed in their data. We learned early on that giving people new ways to do things opens their eyes to better ways of understanding their businesses.

Today, we continue the search for new and better ways to work with data. Whether we are helping customers analyze their data using natural language with Ask Data, or helping them surface outliers and explain specific points in data by leveraging the power of AI in Explain Data, our work in AI only continues to grow now that we're a part of Salesforce. We recently announced that we are bringing together Tableau with Salesforce's Einstein Analytics to deliver the best analytics platform out there. This new platform will create even more ways for people to make the most of their data, from improving the quality of insights, to helping them act faster, to enabling smarter data prep and easier sharing. This is just the beginning of our innovations to come with Salesforce as a partner.

Additionally, we are even more committed to making analytics accessible for everyone with our initiatives around building a data culture, where data is embedded into the identity of the organization. The World Economic Forum just released a report on the future of jobs with the main message that Covid-19 is accelerating the need for companies to scale remote work, speed up automation, and expand digitization. Old jobs will be lost and the newer ones will demand more advanced digital skills, including using data. In fact, the WEF listed data analysts and scientists as the top in-demand jobs of the future. Establishing a data culture is not an overnight process, but it's a worthwhile and essential one, and we hope our work, especially in programs to promote data literacy, can help everyone explore, understand, and communicate with data.

All these recent efforts build on what we've strived to do since the beginning of Tableau: give people new ways of working with their data.
The original VizQL work is still the heart of our product and the work we have done since, including building new data platforms and applying good design principles to create highly engaging products. Everything we work on builds on our mission to help people see and understand their data. We owe a great deal of thanks to the original groundbreaking work in VizQL that has truly stood the test of time.

We're excited to continue to take that same focus, dedication, and excitement for innovation into the future. Today, as Tableau's CTO, I'm focused on examining future technologies and product ideas that we can leverage to push our customers' abilities to work with their data to new heights. And our R&D team remains steadfastly focused on pushing forward with new ideas and how best to turn those into the innovations that will continue to improve Tableau. If you'd like a more in-depth look at our research and development work, please follow our engineering blog.


Daily Coping 3 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
03 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to join a friend doing their hobby and find out why they love it. Joining someone else isn't really a good idea, or very possible, during this time. Colorado is slightly locked down, so it's not necessarily legal, and likely not a good idea. However, my daughter picked up some supplies and started knitting recently. I decided to sit with her a bit and see how the new hobby is progressing. It's something I've been lightly interested in, and it looks somewhat zen to sit and allow your hands to move along, building something while you sit quietly. I remember reading about Rosey Grier picking up the hobby years ago. I have done some minor paracord crafts, usually making bag pulls for the kids I coach. This was similar, and while I don't need another hobby now, I enjoyed watching her work. The post Daily Coping 3 Dec 2020 appeared first on SQLServerCentral.


SQL Homework – December 2020 – Participate in the Advent of Code. from Blog Posts - SQLServerCentral

Anonymous
03 Dec 2020
2 min read
Christmas. Depending on where you live, it's a big deal even if you aren't Christian. It pervades almost every aspect of life. And this year it's going to seep into your professional life. How? I want you to do an Advent calendar. If you've never heard of them, an Advent calendar is a mini gift each day leading up to Christmas. I'll be honest: that's about all I know about Advent calendars. I'm sure there's more to it than that, but this is a SQL blog, not a religious one, so I'm not doing a whole lot of research on the subject.

So what does an Advent calendar have to do with SQL? The Advent of Code! For each of the first 25 days of December there is a two-part programming challenge. These challenges are Christmas themed and can be fairly amusing. They are also some of the best put together programming puzzles I've seen.

For example, for day one you were given a list of numbers. The challenge was to find the one combination where two numbers could be added together to get 2020. Then you had to return the product of those two numbers. Not overly difficult with SQL, but remember that these are programming challenges. Some will favor SQL, some won't. Once you've input the correct answer you'll get part two of the challenge for the day.

Here's what I want you to do. Participate in at least 10-20 days. And that's both parts of the puzzle. If you feel like stretching yourself a bit, give it a shot in multiple languages. Studying Python? Do it with both T-SQL and Python. PowerShell? Give that a shot too. Each language has its strengths and weaknesses. Try to play to the strength of each language.

The post SQL Homework – December 2020 – Participate in the Advent of Code. appeared first on SQLServerCentral.
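For a flavor of what day one might look like in T-SQL, here's a minimal sketch. It uses the puzzle's public example numbers; the temp table and column names are my own, and your real input would be a much longer list:

-- Load the example puzzle input into a temp table.
CREATE TABLE #Expenses (Amount INT NOT NULL);
INSERT INTO #Expenses (Amount)
VALUES (1721), (979), (366), (299), (675), (1456);

-- Self-join to find the pair that sums to 2020, then return the product.
SELECT e1.Amount AS FirstNumber,
       e2.Amount AS SecondNumber,
       e1.Amount * e2.Amount AS Answer
FROM #Expenses AS e1
JOIN #Expenses AS e2
    ON e1.Amount < e2.Amount   -- each pair considered once, no self-pairs
WHERE e1.Amount + e2.Amount = 2020;

Part two of that day (three numbers instead of two) falls out naturally by adding a third join, which is exactly the kind of thing SQL happens to be good at.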


[Video] Azure SQL Database – Import a Database from Blog Posts - SQLServerCentral

Anonymous
03 Dec 2020
1 min read
A quick video showing you how to use a BACPAC to “import” a database into Azure (via a storage container). The post [Video] Azure SQL Database – Import a Database appeared first on SQLServerCentral.


Requesting Comments on the SQLOrlando Operations Manual from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
1 min read
For the past couple of weeks I’ve been trying to capture a lot of ideas about how and what and why we do things in Orlando and put them into an organized format. I’m sharing here in hopes that some of you will find it useful and that some of you will have questions, comments, or suggestions that would make it better. I’ll write more about it later this week; for now I’ll let the document stand on its own, with one exception: below is a list of all the templates we have in Trello that have the details on how to do many of our recurring tasks. I’ll share all of that in the next week or so as well. SQLOrlando Operating Manual (Download) The post Requesting Comments on the SQLOrlando Operations Manual appeared first on SQLServerCentral.

Tableau Prep Builder now available in the browser from What's New

Anonymous
02 Dec 2020
4 min read
Rapinder Jawanda, Spencer Czapiewski – December 2, 2020

With the arrival of Tableau 2020.4, we've made exciting advancements for self-service data prep. Now, you can create new Tableau Prep flows as well as edit existing flows directly in the browser. Since all your analytical work can be done conveniently in one spot on the server, web authoring helps analysts eliminate the context switching between creating in the desktop application and moving the flow to the server. For IT admins, web authoring simplifies the deployment experience and provides more visibility into the data prep process, enabling better data management.

A simpler, smoother data prep experience for all

Web authoring helps analysts by providing an integrated platform to work completely in the browser. You can create data sources, schedule runs, and use those data sources within your workbooks, all on your server. No more costly context switching between platforms and tools; everything can now be done in one place, from anywhere you have access to a browser.

You can create workbooks and flows directly on the web by selecting from the “New” dropdown on the Start page, Explore page, or Data Source page. We have designed the new browser experience to include autosaving as well. When you create or edit a flow in web authoring, your changes are automatically saved in a draft; there's no need to explicitly save your flow, and no risk of losing work in progress. You will see your changes being saved in the header.

Since all your Prep work is now on the same server, everything you do in Prep web authoring is automatically compatible with your version of Tableau Server or Online. Everyone in the organization will get the latest version of Prep Builder in their browser when Server or Online is upgraded. Users only need a supported browser on their machine to start creating and editing flows. This means zero installs for users and less work for IT admins.

Update your flows faster

Prep web authoring allows you to update flows faster because you don’t have to download the flow, open it in the desktop application, and then republish the updated flow. Instead, you can click the “Edit” link and open the flow to make a change directly in the browser. Fewer overall steps and no context switching means increased productivity.

Improved data governance

As an IT professional, you will have visibility into all flows that are created or being edited, giving you more control over data and resource usage. Remove unused flows, or preserve resources by preventing multiple users from running the same flow. You can even put Prep web authoring on a separate node as part of your scale-out plan.

Prep web authoring allows your flows to be fully integrated with Tableau Catalog, part of our Data Management offering. You get complete visibility into the flows being created and run, since all of them are now on the server rather than on individual desktops. With Catalog’s lineage and impact analysis, you can easily track the data sources being created with flows and see which workbooks are using them.

Get started today

To get started with Tableau Prep on the browser, simply upgrade Tableau Server to version 2020.4, then enable flows in web authoring. For more information, read about these settings and topology changes. Now you're ready to start creating flows in the browser. Just click “New” > “Flow” on your Explore page and you can start building your flow just like in Tableau Prep Builder!

Want to learn more?
Interested in more details about Tableau Prep in the browser? For more information, see our Help documentation for Tableau Prep on the Web.  Eager to learn more about how to use Tableau Prep? Head over to the Tableau eLearning site and check out the Prep Builder course!


Ignoring Comments in SQL Compare from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
4 min read
Recently I had a client that wanted to know how they could use SQL Compare to catch actual changes in their code, but not have comments show up as changes. This is fairly easy to do, and this post looks at how this works.

Setting up a Scenario

Let's say I have two databases that are empty. I'll name them Compare1 and Compare2. I'll run this code in Compare1:

CREATE TABLE MyTable
(
  MyKey INT NOT NULL IDENTITY(1, 1) CONSTRAINT MyTablePk PRIMARY KEY
  , MyVal VARCHAR(100));
GO
CREATE PROCEDURE GetMyTable @MyKey INT = NULL
AS
IF @MyKey IS NOT NULL
    SELECT @MyKey AS MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt
    WHERE mt.MyKey = @MyKey;
ELSE
    SELECT mt.MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt;
SELECT 1 AS One;
RETURN;
GO

I'll run the same code in Compare2 and then run SQL Compare 14 against these two databases. As expected, I find no differences. I used the default options here, just picking the databases and running the comparison.

Let's now change some code. In Compare2, I'll adjust the procedure code to look like this:

CREATE OR ALTER PROCEDURE GetMyTable @MyKey INT = NULL
AS
/* Check for a parameter not passed in. If it is missing, then get all data. */
IF @MyKey IS NOT NULL
    SELECT @MyKey AS MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt
    WHERE mt.MyKey = @MyKey;
ELSE
    SELECT mt.MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt;
SELECT 1 AS One;
RETURN;
GO

I can refresh my project, and now I see there is a difference. This procedure is flagged as having 4 different lines, as you see in the image below. However, the procedure isn't different; I've just added comments to one of the procs. You might view this as different in terms of how you run software development, but to the SQL Server engine, these procs are the same. How can I avoid flagging this as a difference and causing a deployment of this code?

Changing Project Options

Redgate has thought of this. In the SQL Compare toolbar, there is an "Edit Project" button. If I click this, I get the dialog that normally starts SQL Compare, with my project and the databases selected. Notice that there are actually four choices at the top of this dialog, with the rightmost one being "Options". If I click this, there are lots of options. I've scrolled down a bit, to the Ignore section. In here, you can see my mouse on the "Ignore comments" option. I'll click that, then click Compare Now, which refreshes my project. Now all objects are shown as identical. However, if I expand the stored procedure object, I can still see the difference. The difference is just ignored by SQL Compare. This lets me track the differences, see them, but not have the project flag them for deployment.

If I'm using any of the Redgate automation tools, the command line option for this is IgnoreComments, or icm. You can pass this into any of the tools to prevent comments from causing a deployment by themselves.

This also works with inline comments. I'll alter the procedure in Compare1 with this code:

CREATE OR ALTER PROCEDURE GetMyTable @MyKey INT = NULL
AS
IF @MyKey IS NOT NULL
    SELECT @MyKey AS MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt
    WHERE mt.MyKey = @MyKey;  -- parameter value filter
ELSE
    SELECT mt.MyKey, mt.MyVal
    FROM  dbo.MyTable AS mt;
SELECT 1 AS One;   -- second result set.
RETURN;
GO

The refreshed project sees the differences, but this is still seen as an identical object for the purposes of deployment.

If you are refactoring code, perhaps by just adding comments or clarifying something, you often may not want a deployment triggered just from changing the notes you leave for other developers. SQL Compare can help here, as can all the Redgate tools. I would recommend this option always be set, unless you have a good reason to allow comments to trigger a deployment. Give SQL Compare a try today if you've never used it, and if you have it, enable this in your projects.

The post Ignoring Comments in SQL Compare appeared first on SQLServerCentral.


PASS SQLSaturday’s I will be speaking at from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
2 min read
I will be speaking at two upcoming PASS SQLSaturdays. These are free events that you can attend virtually:

Azure Synapse Analytics: A Data Lakehouse
12/5/20, 1:10pm EST, PASS SQLSaturday Atlanta BI (info)
Azure Synapse Analytics is Azure SQL Data Warehouse evolved: a limitless analytics service that brings together enterprise data warehousing and Big Data analytics into a single service. It gives you the freedom to query data on your terms, using either serverless on-demand or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs. In this presentation, I'll talk about the new products and features that make up Azure Synapse Analytics and how it fits in a modern data warehouse, as well as provide demonstrations. (register) (full schedule)

How to build your career
12/12/20, 4:30pm EST, PASS SQLSaturday Minnesota (info) (slides)
In three years I went from a complete unknown to a popular blogger, speaker at PASS Summit, a SQL Server MVP, and then joined Microsoft. Along the way I saw my yearly income triple. Is it because I know some secret? Is it because I am a genius? No! It is just about laying out your career path, setting goals, and doing the work. I'll cover tips I learned over my career on everything from interviewing to building your personal brand. I'll discuss perm positions, consulting, contracting, working for Microsoft or partners, hot fields, in-demand skills, social media, networking, presenting, blogging, salary negotiating, dealing with recruiters, certifications, speaking at major conferences, resume tips, and keys to a high-paying career. Your first step to enhancing your career will be to attend this session! Let me be your career coach! (register) (full schedule)

The post PASS SQLSaturday's I will be speaking at first appeared on James Serra's Blog. The post PASS SQLSaturday's I will be speaking at appeared first on SQLServerCentral.


T-SQL Tuesday Retrospective #006: What about blob? from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
1 min read
I am revisiting old T-SQL Tuesday invitations from the very beginning of the project. On May 3, 2010, Michael Coles invited us to write about how we use LOB data, so now you know what this week's post is about. Let's go over some definitions and complain a little, because that's how cranky data professionals… Continue reading T-SQL Tuesday Retrospective #006: What about blob? The post T-SQL Tuesday Retrospective #006: What about blob? appeared first on Born SQL. The post T-SQL Tuesday Retrospective #006: What about blob? appeared first on SQLServerCentral.
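For reference while reading, the large-object (LOB) types in a modern SQL Server database are the MAX data types plus XML, which superseded the older TEXT, NTEXT, and IMAGE types. A minimal sketch of a table mixing in-row and LOB columns (the table and column names here are hypothetical, not from the post):

-- Hypothetical document table: Title stays in-row, Body and RawFile are LOB columns.
CREATE TABLE dbo.Document
(
    DocumentId INT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    Title      NVARCHAR(200)      NOT NULL,   -- ordinary in-row value
    Body       NVARCHAR(MAX)      NULL,       -- LOB: up to ~2 GB of Unicode text
    RawFile    VARBINARY(MAX)     NULL        -- LOB: up to ~2 GB of binary data
);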

Daily Coping 2 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to tune in to a different radio station or TV channel. I enjoy sports radio. For years while I commuted, I caught up on what was happening with the local teams in the Denver area. With the pandemic, I go fewer places and more rarely listen to the station. I miss that a bit, but when I tuned in online, I found some different hosts. One host I used to really enjoy listening to is Alfred Williams. He played for the Broncos, and after retirement, I enjoyed hearing him on the radio. I looked around and found him on 850 KOA. I've made it a point to periodically listen in the afternoon, hear something different, and enjoy Alfred's opinions and thoughts again. The post Daily Coping 2 Dec 2020 appeared first on SQLServerCentral.


SQL Database Corruption, how to investigate root cause? from Blog Posts - SQLServerCentral

Anonymous
02 Dec 2020
5 min read
Introduction:

In this article, we will discuss MS SQL Server database corruption. First, we need to understand what causes corruption. In almost every SQL Server corruption scenario, the root cause sits at the IO subsystem level: a problem with the drives or their drivers. The specific root causes can vary widely, simply due to the sheer complexity involved in dealing with magnetic storage. The main thing to remember about disk systems is that, as anyone in IT knows, every major operating system ships with the equivalent of a disk-check utility (CHKDSK) that can scan for bad sectors, bad entries, and other storage issues that can infiltrate storage environments.

Summary:

If you are a beginner with Microsoft SQL Server, you might try the following things to solve database corruption, and these tricks can't help you out:

Restarting SQL Server. This just delays the issue and causes the system to run through crash recovery on the databases. Not to mention, in most systems you will not be able to do this right away, which holds up the issue further.

Clearing the procedure cache.

Detaching and moving the database to a new server. When you do this you will feel pain, because the database will fail to attach on the second server and on your primary. At that point you have to look into a "hack attach", and I can understand it can be a very painful experience.

Knowing what will be helpful to solve a problem, and what can't be, requires that you be prepared ahead of time for these kinds of problems. That means deliberately creating a corrupt database and trying everything to recover that database with the least data loss. You may read this: How to Reduce the Risk of SQL Database Corruption.

Root cause analysis:

Root cause analysis is a crucial part of this process and should not be overlooked, regardless of how you work through the incident. It is a vital step in preventing the problem from occurring again, possibly sooner than you think. In my experience, once corruption happens, it is bound to happen again if no action is taken to rectify the problem. Additionally, it often seems to be worse the second time. Now, I'd suggest that even if you think you know the cause of the corruption (e.g. a power outage with no UPS), investigate the following sources anyway. Perhaps the outage just finished the job and there were warning signs already occurring. To begin, I always recommend these places to look:

Memory and disk diagnostics, to make certain there are no problems with the current hardware

SQL Server error logs

Windows Event Viewer

While rare, check with your vendors to see if they have had issues with the software you're using. Software errors happen; believe it or not, Microsoft has been known to cause corruption. See KB2969896. This is where opening tickets with Microsoft is also helpful.

The Event Viewer and SQL Server error logs can be reviewed together, but I suggest handing these out to the system administrators, as they regularly have more manpower on their team to review them.
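When corruption is suspected, the standard first diagnostic in T-SQL is a full consistency check; its error output identifies the damaged objects and the minimum repair level. A minimal sketch (the database name is a placeholder):

-- Full consistency check: report every error, suppress informational messages.
DBCC CHECKDB (N'MyDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;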
Helpful Tip:

In fact, even once you know what the problem is, I always suggest opening a ticket with Microsoft, because they will not only give an extra set of eyes on the issue but also their expertise on the subject. Additionally, Microsoft can and will assist you with the next steps to help find the root cause of the problem and where the corruption originated.

Corruption problems:

If the database is corrupt, it is possible to repair the database using SQL recovery software. This software will allow repairing the database in case of corruption.

Conclusion:

So finally, after this article, we have learned many things about database corruption and how to deal with a corrupt database. Most of these problems are quite common, and now you can solve this kind of common corruption. Over time, as you work through this series, the goal is that when corruption occurs, you find out from your alerts, not an end user, and you have a procedure to let your managers know where you stand and what the next steps are. Because of this, you will get a lot of benefits, and it allows you to work without having someone breathing down your neck frequently. www.PracticalSqlDba.com

The post SQL Database Corruption, how to investigate root cause? appeared first on SQLServerCentral.


World Statistics Day: The search and need for trusted data from What's New

Anonymous
02 Dec 2020
5 min read
Andy Cotgreave, Technical Evangelist Director, Tableau – December 2, 2020

Editor's note: A version of this article originally appeared in Information Age.

This year’s UN World Statistics Day theme of “connecting the world with data we can trust” feels particularly timely. The global pandemic has put data at the heart of how the public is informed and persuaded to change behaviors. There has been a huge learning curve for the general public and for governments, with many new public-health statistical systems being built from scratch in country after country, globally.

Even though data has become more influential in our lives, people’s level of confidence in using and asking questions of data hasn’t increased. Simply being presented with statistical charts of the pandemic hasn’t made us all more data literate. If the handling and presenting of data during the pandemic has shown us anything, it’s that public citizens, politicians, and the media all need to commit to knowing and interrogating data. This will be even more relevant as the second COVID-19 infection wave affects our economies and we look for signs in the data that the pandemic may be receding.

In the spirit of World Statistics Day, what more can governments be doing to improve how they use and present data to the public? Should citizens themselves be responsible for making sure they understand data being presented to them, so they can form objective opinions?

What is data without trust?

Those in positions of responsibility are facing major challenges when it comes to trusted data use, as the current pandemic shows how important data is for society, politics, and companies. Transparency is vital. This situation also shows that the understanding of data, and related analyses, is not obvious. Do consumers of the insights know where the data comes from, or how it was modeled? Is it clear where there is uncertainty in the underlying data? Is the source data available for others to interrogate?

Think back to the “flatten the curve” charts that taught us so much at the start of this pandemic. The images presented two possible outcomes, based on different levels of lockdowns. This type of chart was easy to understand, and they were accompanied by detailed data stories explaining how they worked. By not overcomplicating the narrative, local politicians and media outlets were able to clearly communicate their key messages to the public and, through that clarity and openness, were able to establish a level of trust. As we all came to terms with the new disease, the data, having been presented so well, helped people change their behaviors.

Data is at the crux of the decision-making process

Over time, as pandemic fatigue has set in, along with the reality that science and statistics are uncertain, people have become less trusting of the data presented to them. First off, an inherent problem is that people believe data contains more “truths” about what has happened. This is a fallacy. For a start, all data is messy, even in robust systems. Furthermore, charts are not neutral. Imagine a chart showing an “average number of COVID-19 cases”; I could choose the mean, median, or mode. Or I could choose a 7- or 14-day moving average, with each chart telling a different story. We also read charts as we read an opinion piece in the pages of a newspaper: our own biases affect how we interpret the data. Just like the audience’s own biases, their level of data literacy also impacts the interpretation.
All of this is mitigated if data sources are open, skills are always being developed, and a culture of conversation is encouraged.

Data literacy should be a core competency

Even before the pandemic, it was clear that national data literacy levels should be raised significantly. But this year, COVID-19 has highlighted the ever-present challenge of data literacy both within the wider population and at the top levels of government and policy-making. At its most basic level, data literacy is the ability to explore, understand, and communicate with data. But in order for a data-led strategy and approach to work effectively on a large scale, more effort needs to be put into considering how to build a Data Culture. Specifically, one that encourages answers and interrogations to a series of fruitful questions about data in society and business.

A significant part of the challenge facing government and businesses is to shatter the inscrutability around data, and instill data literacy as a core competency across a far broader cross-section of the workforce. I challenge the government and businesses to do better at making data literacy, and the skills required, both accessible and a priority. By doing this, we will then begin to build a society that is more inclusive, trustworthy, and collaborative with data, ultimately connecting the world through data that we can trust.

Would You Pass the SQL Server Certifications Please? What Do You Mean We're Out? from Blog Posts - SQLServerCentral

Anonymous
01 Dec 2020
5 min read
I have held various certifications through my DBA career, from the CompTIA A+ certification back when I worked help desk (I'm old) through the various MCxx certs that Microsoft has offered over the years (although I never went for Microsoft Certified Master (MCM), which I still regret). I have definitely gotten some mileage out of my certs over the years, getting an interview or an offer not just because I was certified, but rather because I had comparable job experience to someone else *and* I was certified, nudging me past the other candidate.

I am currently an MCSA: SQL 2016 Database Administration and an MCSE: Data Management and Analytics, which is pretty much the top of the SQL Server certifications currently available. I also work for a company that is a Microsoft partner (and have previously worked for other Microsoft partners), and part of the requirements to become (and stay) a Microsoft partner is maintaining a certain number of employees certified at certain levels of certification, dependent on your partnership level.

I completed the MCSE back in 2019, and my company is starting to have a renewed focus on certifications (a pivot, so to speak - I hate that term but it is accurate), so I went out to look at what my options were. We are two SQL Server versions past SQL Server 2016 at this point, so there must be something else, right? On top of that, the MCSA and MCSE certs I currently have are marked to expire *next* month (January 2021 - seriously, check it out HERE)...so there *MUST* be something else, right - something to replace them with or to upgrade to?

I went to check the official Microsoft certifications site (https://docs.microsoft.com/en-us/learn/certifications/browse/?products=sql-server&resource_type=certification) and found that the only SQL Server-relevant certification beyond the MCSE: Data Management and Analytics is the relatively new "Microsoft Certified: Azure Database Administrator Associate" certification (https://docs.microsoft.com/en-us/learn/certifications/azure-database-administrator-associate).

The official description of this certification is as follows:

The Azure Database Administrator implements and manages the operational aspects of cloud-native and hybrid data platform solutions built with Microsoft SQL Server and Microsoft Azure Data Services. The Azure Database Administrator uses a variety of methods and tools to perform day-to-day operations, including applying knowledge of using T-SQL for administrative management purposes.

Cloud...Cloud, Cloud...Cloud...(SQL)...Cloud, Cloud, Cloud...by the way, SQL.

Microsoft has been driving toward the cloud for a very long time - everything is "Cloud First" (developed in Azure before being retrofit into on-premises products) - and the company definitely tries to steer as much into the cloud as it can. I realize this is Microsoft's reality, and I have had some useful experiences using the cloud for Azure VMs and Azure SQL Database over the years...but...

There is still an awful lot of the world running on physical machines - either directly or via a certain virtualization platform that starts with VM and rhymes with everywhere. As such, I can't believe Microsoft has bailed on actual SQL Server certifications...but it sure looks that way. Maybe something shiny and new will come out of this; maybe there will be a new better, stronger, faster SQL Server certification in the near future - but the current lack of open discussion doesn't inspire hope.
--

Looking at the Azure Database Administrator Associate certification, it requires a single exam (DP-300, https://docs.microsoft.com/en-us/learn/certifications/exams/dp-300) and is apparently "Associate" level. Since the styling of certs is apparently changing (after all, it isn't the MCxx Azure Database Administrator), I went to look at what Associate meant.

Apparently there are Fundamental, Associate, and Expert level certifications in the new role-based certification setup, and there are currently only Expert-level certs for a handful of technologies, most of them Office- and 365-related. This means that for most system administrators - database and otherwise - there is nowhere to go beyond the "Associate" level. You can dabble in different technologies, but there is no way to be certified as an "Expert" by Microsoft in SQL Server, cloud or otherwise. (The one exception I could find for any sysadmins is the "Microsoft Certified: Azure Solutions Architect Expert" certification, which is about designing and implementing in Azure at a much broader level.)

--

After reviewing all of this, I am already preparing for the Azure Database Administrator Associate certification via the DP-300 exam, and I am considering other options for broadening my experience, including Azure administrator certs and AWS administrator certs. I will likely focus on Azure since my current role has more Azure exposure than AWS (although maybe that is a reason to go towards AWS and broaden my field...hmm...)

If anything changes in the SQL Server cert world - some cool new "OMG we forgot we don't have a new SQL Server certification - here you go" announcement - I will let you know.

The post Would You Pass the SQL Server Certifications Please? What Do You Mean We're Out? appeared first on SQLServerCentral.


PASS Summit 2020 Mental Health Presentation Eval Comments from Blog Posts - SQLServerCentral

Anonymous
01 Dec 2020
5 min read
I first started presenting on mental health last December in Charlotte, NC. I got some backlash at a couple of events from a few people about “me” being the person talking about it, but I’ve gotten overwhelmingly more support than backlash. I just want to share the comments I got from the 20 people who filled out evals at Summit. If you notice, one person actually used something they learned with a family member that week. I’m not going to worry about revealing scores (mine was the highest I’ve ever gotten), but if someone could please fix the wording on that darn vendor-related question so I can quit getting my score lowered while not advertising anything, it would be great.

I would ask any managers out there reading this who have had to deal with employees with mental health issues to contact me. I do get questions on the best way to approach an employee someone is concerned about, and I have given advice on how I would like to be approached, but I would like to hear how it looks from a manager’s perspective. Just DM me on Twitter. I don’t need any details on the person or particular situation, just how you approached the person with the issue.

These are all the comments I received, unedited:

This is important stuff to keep in mind both for oneself and when working with and watching out for others, personally and professionally. Thank you.

I’m happy that people are starting to become more comfortable sharing their battles with mental illness and depression. Tracy eloquently shared her story in a way that makes us want to take this session back to spread to our colleagues. There definitely needs to be an end to the stigma surrounding mental health; sessions like Tracy’s are helping crack that barrier.

VERY valuable session. Thanks!

Thanks Tracy! That was a wonderful session and thank you for discussing the elephant in the room as the saying goes. I didn’t realize there are higher rates of mental health issues for us IT folks. I’ve also struggled with co-workers that didn’t understand and were not compassionate about what I was going through at the time, which made things harder. Thanks again! Rich

This is a great session. It is good to remind ourselves that we are all human and need to focus on our mental health. Also I have known Tracy for awhile and I know that she is super talented and does so much to give back to not only PASS but other great causes too. Hearing about some of the challenges she has had helps to demonstrate that we are all more a like than we are different in that we all struggle with things from time to time. Also great use of pictures in the session. Having relevant pictures through out made the presentation speak louder for sure.

Thanks for sharing your story, Tracy!

valuable topic

I admire Tracy’s strength for talking about what she has been through. Hopefully it opens the door for others to be able to speak more openly in the future. as far as the presentation itself, the slides were good and gave a good summary of the discussion.

Thank you for speaking about this. It’s good to hear that we’re not alone in feeling stress. The list of resources in the slides is of great help.

I really wish you had done a session like this with a health professional. It was okay to hear first hand experience but I think that insight from a mental health professional would have been much more helpful.

It takes a lot of courage to approach and discuss this topic. This was a very good reminder to me to stop and remember it’s not all about deadlines etc.
Some the statistics were very eye opening. I’ve been impacted by several suicides over the last five years and it is hard to understand and to understand how to help. It’s good to be reminded that just listening helps.

Tracy is exceptionally brave; I appreciate her work to destigmatize the topic and provide practical and tangible advice. Much appreciated.

Thanks Tracy. I was able to use some of the things you taught me to work through a mental health issue in my family yesterday and the results were excellent. Keep sharing! Thank you so much.

Thank you for sharing your story and helping me realize how many people struggle with mental health in IT. Thank you for the pointers on how to help a friend. Thank you for the survival tips.

This was the most valuable session of the whole conference for me!

The post PASS Summit 2020 Mental Health Presentation Eval Comments first appeared on Tracy Boggiano's SQL Server Blog. The post PASS Summit 2020 Mental Health Presentation Eval Comments appeared first on SQLServerCentral.