
Tech News - Databases

233 Articles

Splitting up the Mission of PASS from Blog Posts - SQLServerCentral

Anonymous
28 Dec 2020
5 min read
Following up on Should There Be A Successor to PASS? I have a couple more thoughts. One of the many complaints about PASS over the years was about perceived value. Most everyone saw the value of the Summit, but after that it was not a simple conversation to convince someone in the audience about the value of PASS in a way that really mattered to that person. I've seen some make the case better than others, but it wasn't easy. At the same time, in many ways the only thing that mattered to PASS was whether groups or events drove registration to the Summit. I think those two ideas speak a bit to the dissonance felt by many about an organization that was, in name at least, an organization for professionals. My intent isn't to beat on PASS, but to recognize that if we were to magically rehydrate it today, we'd have all the same old problems and pains. If we want to do better, we have to think about those pain points, and that in turn leads us to think about what problem(s) we want to solve.

For example, take user groups. If you were designing a brand new non-profit to help user groups, what would be the mission? Is it to serve all those local communities? Or is it to serve the group leader and key volunteers? PASS mattered a lot to group leaders because it offered:
- An instant network and identity, plus access to some getting-started resources
- A free solution for hosting the group, gathering a mailing list, and to some degree making it findable
- An incentive in the form of a free registration to the Summit

But besides connecting Joe or Jane Attendee to PASS, which might yield some future benefit, attendees at a user group saw the user group as the thing that was delivering value. I don't see anything wrong with that at all, but it depends on what you think that non-profit in the distance is supposed to be doing. I think it serves local communities indirectly.

You could say that SQLSaturday is very similar, with perhaps a better set of tools. It provides an event framework, some very light guard rails, and even some seed money, and uses that to inspire and empower that one local volunteer who will make the magic happen. At the end of a SQLSaturday it's only right that the cheers and thanks go to the organizers and speakers and other volunteers. It's not that what PASS provided had no value, but trying to get credit for it, or monetize that credit in the form of paid Summit registrations, while not evil, was a distraction from the main mission of doing good locally.

A true professional association might well combine all those things, as PASS attempted, but the problem is giving each segment enough time and focus. It's not impossible, it's just hard. Instead, what if we built them as separate orgs, each responsible for defining a mission, a way to measure success, and a way to raise funds to enable that mission? Here's my idea list (with generic names for now):
- SQLGroups. Exists to grow SQL user groups and user group attendance. How we do that, well, that's the story to write, isn't it? This is maybe 200 group leaders and a few hundred volunteers; how would it serve them?
- SQLSaturday. Pretty much just like it was, with perhaps one or two fewer rules! Maybe we make the framework more generic so that we could host other events or on different days, maybe it's open source, or maybe it's something that should serve a wider community than just SQL. Code Camps? What do we do if someone wants to do something along the lines of Rally or Bits? Does that fit here?
- SQLFamily. Why not? I'd appreciate a place where I could get news about personal events, lightly curated.
- SQLAssociation. Building on the very raw beginnings of PASS Pro, this could be the list you join to get once-a-month news about SQL, links to other resources (free and paid), a marketplace for tools and training, maybe a way to track education credits for classes. I think this is perhaps the true replacement for PASS, with a fraction of the mission (and budget).
- SQLSpeakers. Why not a site/org that is just for speakers, experienced or starting out? What could that provide?
- SQLBloggers. Much like speakers, what could this do? We could at least resurrect the blogger of the year contest.

As soon as you start thinking about those, it can be exciting and confusing. Why not put groups and SQLSaturday together? Idk, maybe that is the right thing to do! Not everything needs to be free, or not for profit, or community owned. The list above may not be the right list, and it's certainly not a complete list. It's a variation of my five hats theory, which is about finding and taking care of various parts of your audience.

You probably noticed that the Summit isn't on my list. Huge events are a lot of work and a lot of risk, and I think they are better left to for-profit enterprises. Maybe we'd pick an "official" event each year, or find a trusted partner. If a Summit replacement springs up, we can hope that it will also try to do some good in the form of providing grants to whatever org(s) we charter, if any. Nothing about that should preclude us from encouraging and building regional events at a lower price point.

What good will we do and how do we pay for it? If we can answer those, then we can figure out an org and governance model, if one is needed at all.

The post Splitting up the Mission of PASS appeared first on SQLServerCentral.


Building a SQL Saturday Archive from Blog Posts - SQLServerCentral

Anonymous
28 Dec 2020
6 min read
Some time ago I started gathering the SQL Saturday XML files. I started to parse and work with these, but the data is a mess and it never was that important. I regret that now. In any case, with the announcement of insolvency from PASS recently, I asked people to save some data. One of those that responded was Ben Weissman (b|t), and he actually rendered out all the available schedules as PDFs (using VB, of course). He sent them to me and I decided to load them up.

TL;DR: You can see my page here: https://dataplatformdaysweb.azurewebsites.net/

I had started an initial project for replacing SQL Saturday, but hadn't planned on anything more than a static site. Actually, I wanted a sortable grid, but that was beyond the time and web skills I have right now. That's still a goal. This is a quick look at how I built things. I am not a real software developer, at least not on the current, interactive web. This stuff likely makes sense to any junior web dev, but it was learning for me.

Azure DevOps and Dot Net Core

I wanted a simple site, but I wanted this built in Azure DevOps, so it has some control and tracking. I thought about a simple HTML site, but producing a build with that didn't seem intuitive to me, so I fired up Visual Studio. I chose a .NET Core ASP.NET React application, as I may move this to Linux. It's cheaper. In any case, I took the defaults. No real reason other than I've tried MVC and that was hard, and lots of people seem to like React. I also have people I can bug at Redgate. I got the default project to build locally. Then I changed the names of the pages and loaded this into an Azure DevOps repo. Once up there, I took a default build process. I pointed this at my repo and then clicked Save and Queue… and it failed.

Changes to the Build

I got a message that the NuGet restore wouldn't work with .NET Core 3.1. I could fall back to 2.2, but when I did that, the project wouldn't build locally. I realized I'd initially selected a Windows VS-2016 hosted agent, but I had built the project on VS2019. I changed that to the Windows 2019 agent and it worked.

Deployment to Azure

I'd set up an Azure App Service already, and I created a very simple release step. I linked my artifact and selected a release to an Azure App Service Plan. I had to authorize my plan, but once I did that, I was able to select the App Service I'd set up. No configuration needed. I clicked save, built a release, and ran it. I found the default React site at my URL.

Changes to the Project

I made a few changes to the project as well, to remove some of the defaults. First, I needed to load my PDFs into the project. I had originally created an Assets folder in the root of the project, but that did not get included in the artifact that was built. Looking at the project, and searching around Google a bit, led me to see that the main page, index.html, was in the ClientApp/public folder. I moved my Assets folder below this, and then saw all my files included in the build artifact and deployed. I also wanted to remove some of the default sample menu items. I found these in the ClientApp/src/components folder, in NavMenu.js. I deleted the two entries, leaving just a "home" there for now. I may do some other grouping later.

Building the Archive

This was the more interesting item for me. Ben had sent me a ZIP file with all the PDF files in it. I unzipped these and I saw this view:

Originally I thought a simple list of numbers and files would get me started, but there are hundreds of files. How can I do this? My first thought was that PowerShell could help. I popped this open and used Get-ChildItem to get a list of files and compiled it into a variable. I have been wanting to use Azure Data Studio more for PoSh, and that's where I did this. This got me a basic HTML list of my files. I had trouble with the pathing, so rather than futz around and try to build production code here, I just used this and then a "search and replace" of the [a href="] to add a [a href="/Assets/PDF/"] got me the correct paths. I need to learn how to properly get paths working here in PoSh, but this string manipulation wasn't important for a one-off task. Once I had this, I had something.
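For illustration, here is a minimal PowerShell sketch of that listing step, with the /Assets/PDF/ path folded into each link up front rather than patched in afterwards. The folder and output file names are placeholders of mine, not from the original post.

# List the archived PDFs and emit a basic HTML list of links (paths are placeholders)
$pdfFolder = "C:\SQLSatArchive\PDF"
$links = Get-ChildItem -Path $pdfFolder -Filter *.pdf |
    Sort-Object Name |
    ForEach-Object { "<li><a href=""/Assets/PDF/$($_.Name)"">$($_.BaseName)</a></li>" }

# Wrap the items in a <ul> and save the fragment for pasting into the page
"<ul>`r`n$($links -join "`r`n")`r`n</ul>" | Set-Content -Path "C:\SQLSatArchive\pdflist.html"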
Of course, at this point, Ben sent me his index list of the event names, which was what I really wanted. I could have taken the source of his page and used search and replace to get the pathing, but I did something stupider. In a hurry, I copied and pasted his list of events into SSMS in a new query window. One of the reasons I do this is that the cross-line editing is superior (IMHO) to VS and VSCode. I'll repeat the process with just a few lines here, but keep in mind I had something like 800. This is a useful text trick as well for some data changing. I had this list:

I wanted to make this a table, so I used Shift+Alt+Arrow to select the entire first column. I then added my table HTML. I could do this in VSCode, but the reason I like SSMS is that I can space over to the right and then get a vertical edit line, rather than a set of end-of-line cursors. I can then create another edit point and add other code, like this:

I wrapped this in the table beginning and ending tags and had my table. What about the URLs? Well, I could easily add the paths, but then getting the individual file names was hard. Or was it? I used the same trick. I pasted my list code into SSMS and selected all the file names:

I copied and pasted this set of vertical text into my table, and voilà, I had a working table that looked, well, about as good as I could make it quickly.

More to come, as I try to archive and preserve the SQL Saturday data and history as best I can.

The post Building a SQL Saturday Archive appeared first on SQLServerCentral.


Daily Coping 28 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
28 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to contact someone who may be alone or feeling isolated. I call my Mom every week or so, knowing she's alone. She's content, but I do know that life is better with people in it. However, I know my brother and other family talk with her, so I'm not too concerned. I do have other friends that I know are without a partner in life, either younger or older, and I decided to reach out recently, when I typed this post. In this case, I have a friend I know lives alone. This person has family, but I don't know how often they see anyone in person or have a conversation. I sent a message, and then had a conversation back and forth, just catching up. It made my day. Hopefully they can say the same. The post Daily Coping 28 Dec 2020 appeared first on SQLServerCentral.


Simple Method to Add A Network in SQL Configuration Manager from Blog Posts - SQLServerCentral

Anonymous
28 Dec 2020
8 min read
In the previous article, I showed a bothersome little error in the Kerberos Configuration Manager software. In that article, I also showed how to resolve that error. This article has nothing to do with that previous article beyond the similarity in tool names ("Configuration Manager" being central to both). In this article, I will show you how to add or rename a network adapter to be used by SQL Server.

You are probably thinking "What's the big deal? SQL Server does that automatically." Well, for the most part SQL Server does automatically add your network adapters for you. I would guess the success rate is somewhere in the realm of 99.999% of the time. However, that is not 100% of the time, and it just so happens I have run into a couple of occasions where the NIC was not properly presented to SQL Server for use, or even for configuration by the DBA. This presents a bit of a problem!

Add a Network Adapter to SQL Server Configuration Manager

Before diving into adding a network adapter, let's take a look at what the network properties pages might look like in SQL Server Configuration Manager. Many of you are likely very familiar with this already. If you are not, you should be! In SQL Server Configuration Manager, expand "SQL Server Network Configuration", then click on the instance you are interested in. Note here that I am showing a multi-instance server. Adding multiple network adapters is a configuration method that I frequently use when dealing with a multi-instance server. There are several benefits to having multiple network adapters, such as: 1) being able to have each instance listen on the same port – but on different adapters, 2) being able to add an A record in DNS for each instance, and 3) adding a layer of security by obfuscation (e.g. developers don't know the instance name – just an alias for it).

When multiple network adapters will be presented to SQL Server and multiple instances exist on the server, the first thing to do for this more advanced setup is to disable "Listen All". With that configuration set, the next page you should be familiar with is the "IP Addresses" page. This is the page where the network adapters should be found.

The preceding image is a very busy image. There are three network adapters on this particular server. Each network adapter is assigned a different IP address and each has a different port. In this case, which happens to be a multi-instance server, not all of the assigned adapters are listening on the specified port. This is a desired configuration when everything is working well. This is where the problem that is the basis for this article comes into play – what happens when the network adapters do not automatically show up in SQL Server Configuration Manager (SSCM)? Alternatively, if the adapters are all showing, how do I keep them all straight in SSCM so I can make sure the correct adapter is enabled or disabled for the instance in question?

Let's add that Network Adapter

Now, SQL Server should detect a new network adapter when it is added to Windows. Sometimes it takes a server restart. Sometimes it never shows up. And sometimes it shows up but you want to rename it. Some will say to just run a repair on your instance if it is not showing the network adapter changes. After a recent experience with that, I say don't bother with the repair!

In a recent adventure with this problem, I had two adapters presented to the server prior to installing SQL Server, and only one of the adapters was recognized by SQL Server. Clearly, a repair after that would have been fruitless because setup didn't find the second adapter in the first place. Additionally, running repair on the server could cause more harm than it might fix. So, if we don't want to run repair, then what should we do? Regedit to the rescue!

Oh no, not that! Isn't that risky as well? Truth be told, editing the registry can cause damage if you are careless, reckless, and maybe in a bad mood. Editing the registry is no more dangerous than altering data in a database. Just take a few precautions if you are worried. Some precautions might include taking a backup of the registry or maybe a system state backup of the server. These are the same sorts of precautions a good DBA would take when altering data in a database (backup the data, backup the server, etc.).

Let's see what hacking the registry would entail. First and foremost, the path we are looking for in the registry is a bit long and hairy, and you must pay particular attention here:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQLXX.[YourInstanceIdentifier]\MSSQLServer\SuperSocketNetLib\Tcp]

In the preceding image, my path is (circled in red at the top of the image)

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL14.DIXSEPTLATIN1\MSSQLServer\SuperSocketNetLib\Tcp]

where my version of SQL is 14 (replacing the XX) and my instance identifier is DIXSEPTLATIN1 (replacing "[YourInstanceIdentifier]"). Once Tcp is expanded, you will see all of your adapters that are usable in SSCM. In order to change any of these adapters (or to add a new one), the first step is to export the key for either the adapter to be changed or the adapter to be used as a model for creating the missing adapter. As noted in the image, you will export the key to a reg file. Everything in IP2 shown in the image will then be exported and saved. It is this exported reg file that we will edit in order to create a new adapter (or rename an adapter).

Here is an example of an edited reg file that is ready for import into the registry. In this example I have only made a couple of changes for this demo adapter. In this case, I named the adapter to match the instance name for which it will be assigned and "listening". Then I proceeded to modify the IpAddress (in red) to the appropriate IP address that has been reserved for that network adapter. You will need to modify each of these settings as best suits your needs. I like to name the adapters to match the instance because then it becomes very easy to keep things straight and troubleshoot in the future.

After making the adjustments in the file, it is time to save the file and "import" it into the registry. This step is easy – just double-click the file from Explorer and you will be prompted with some warnings about writing data to the registry. Once imported, I see something like this in my registry:

See how that adapter has been added in the registry? This is precisely what we want. Now when we look at the network properties pages in SSCM for this instance (because this is instance specific), we will see a graphical representation of that adapter that we can edit. One major caveat illustrated in this image of the IP Addresses page in SSCM is that the adapter name is different than what I showed in the reg file. This was done to help illustrate a requirement here: if the adapter name does not begin with "IP", then SSCM will not pick up the new adapter. You must name your adapters with "IP" in front for them to be usable in SSCM. Now that you have more adapters added and named to your liking, you are set for a more custom setup for SQL Server that allows you greater security and flexibility.
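If you would rather script the change than hand-edit an exported .reg file, the same kind of entry can be created directly from PowerShell. This is a minimal sketch under assumed values (the key name IPDEMO, the IP address, and the port are placeholders of mine, not values from the original post), and it follows the same rule that the key name must start with "IP".

# Placeholder instance path from the article; adjust the version and instance identifier to match your server
$tcp = "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL14.DIXSEPTLATIN1\MSSQLServer\SuperSocketNetLib\Tcp"

# Create a new adapter key; the name must begin with "IP" for SSCM to show it
New-Item -Path $tcp -Name "IPDEMO" | Out-Null

# Populate the same values you would see in an exported IPn key
New-ItemProperty -Path "$tcp\IPDEMO" -Name Enabled         -PropertyType DWord  -Value 1 | Out-Null
New-ItemProperty -Path "$tcp\IPDEMO" -Name Active          -PropertyType DWord  -Value 1 | Out-Null
New-ItemProperty -Path "$tcp\IPDEMO" -Name IpAddress       -PropertyType String -Value "10.10.10.42" | Out-Null
New-ItemProperty -Path "$tcp\IPDEMO" -Name TcpPort         -PropertyType String -Value "1433" | Out-Null
New-ItemProperty -Path "$tcp\IPDEMO" -Name TcpDynamicPorts -PropertyType String -Value "" | Out-Null

As with the reg file import, you will typically need to restart the SQL Server service for the new adapter entry to take effect.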
Put a bow on it

In this article I have demonstrated a fairly simple method to help you modify the network adapters that are presented to SQL Server. Being able to modify these adapters is essential in multiple scenarios, such as when an adapter is just plain missing or when you want to rename adapters in order to more easily manage the server in SSCM.

Interested in more back to basics articles? Check these out! Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the second article in the 2020 "12 Days of Christmas" series. For the full list of articles, please visit this page.

The post Simple Method to Add A Network in SQL Configuration Manager first appeared on SQL RNNR.

Related Posts:
- How To Resolve User Error in Kerberos Configuration Manager (December 26, 2020)
- Configuration Manager is Corrupt (December 17, 2019)
- CRM Data Source Connection Error (January 23, 2020)
- The Gift of the SPN (December 10, 2019)
- Changing Default Logs Directory - Back to Basics (January 4, 2018)

The post Simple Method to Add A Network in SQL Configuration Manager appeared first on SQLServerCentral.


How To Resolve User Error in Kerberos Configuration Manager from Blog Posts - SQLServerCentral

Anonymous
26 Dec 2020
6 min read
A recurring theme in the world of SQL Server seems to be the battle with Kerberos, SPNs, and SSPI context errors. It is common enough that many DBAs have gone bald, along with their domain admin counterparts, trying to fix the problems. Once upon a time, I contributed an article showing a decent tool that can help figure out some of the problems related to SPNs, SSPI errors, and Kerberos in general – with regards to SQL Server. The tool I mentioned in that article is called "Kerberos Configuration Manager" (KCM). Recently, I ran into an error with this tool that is a bit absurd and not very helpful at all. Given the usefulness of the error and the absurdity of it, I thought to myself – why not share the problem and resolution in this, the first article in this year's 12 Days of Christmas series.

What A Wretched Little Error

On the surface this seems like a pretty chill error. It seems that there is useful information in the error screen that could help one go and figure out what to do next. Looking at the error text, it appears that I have an issue with permissions because I can't access UserAccount information, and the error text gives me a log to go check. Let's break this down a little bit. This error pops up with a user that happens to be a member of the local admins group on the server in question. The user also happens to be a domain administrator. And <grimace>, this user is also a sysadmin in SQL Server. So, seemingly, permissions should not be an issue, right?

I know, I know. This is not an ideal security setup. It just so happens to be a point of interest currently being discussed and worked on with this client. The security setup will get better. That said, I would eliminate permissions as a variable, and therefore permissions would not be a cause of this error.

Let's take a look at the next given (sorry, mathematical proof concept shining through there), aka the next bit of information from the error text. The error text tells me there is a log available and gives me the directory where it should exist, so it is time to look at that log. If I proceed to open that file and look at the contents, I frequently see something like this (4 for 4 with this particular client). Note here that the file is entirely empty. This is a problem! The place I am supposed to look to resolve this problem has nothing logged to the file. How can I possibly use that to troubleshoot? Well, keep reading.

The Cause and How to Fix this Kerberos Config Mgr Error

The error message is certainly misleading. Then again, maybe it isn't. As it turns out, the cause of the message is the existence of a ghosted AD account in the local admins group. Here is an example of what I am talking about in the image to the right. The local admins group on each of the affected systems had at least one of these detached SIDs. These are accounts that basically don't exist any longer in Active Directory. These accounts should be cleaned out on a regular basis, and it is a fairly risk-free process.

Given this bit of insight, if you re-examine the error text, it now makes sense. There is an account the tool cannot access because the account does not truly exist – just some shards of it. To fix the error, just delete these SIDs from the Local Admins group and then run the KCM tool again. After the ghost SIDs are removed, an interesting thing happens (besides the KCM working properly). When you open the log file again, you will see something different. Here is an example.

Looking closer at the text of the log, this is the line of greatest interest:

Error: Access of system information failed System.DirectoryServices.AccountManagement.PrincipalOperationException: An error (1332) occurred while enumerating the group membership. The member's SID could not be resolved.

Clearly, if this message had populated before the problem was fixed, I would have been able to fix the problem by a more direct path. It clearly states that there is a problem with a SID and that the SID could not be resolved. Why the tool needs to be able to resolve all SIDs escapes me, but it is what it is and we just roll with it for now.
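To make spotting these ghosted accounts a little easier, here is a minimal PowerShell sketch (my own detection approach, not from the original post) that lists the members of the local Administrators group and flags any entry that only shows up as a raw SID, which is how an orphaned account typically appears. Run it elevated on the affected server, confirm the entries, and then remove them.

# Enumerate local Administrators via ADSI; members that no longer resolve show up as bare SIDs
$admins = [ADSI]"WinNT://./Administrators,group"
$admins.psbase.Invoke("Members") | ForEach-Object {
    $name = $_.GetType().InvokeMember("Name", "GetProperty", $null, $_, $null)
    [pscustomobject]@{
        Member   = $name
        Orphaned = ($name -match '^S-1-5-21-')   # a raw SID here usually means the account no longer exists in AD
    }
}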
Put a bow on it

This article showed a problem with one of my favorite tools – Kerberos Configuration Manager. This tool provides a great deal of power in helping to resolve various SPN-related problems with your SQL Server instances. Sadly, the error in this case is a bit of a pain to figure out because the log doesn't populate properly when the error is thrown. Rather, the log seems to populate after the error is resolved. The solution provided in this article is an easy fix and is consistent across multiple versions of Windows and SQL Server. Save yourself some headache up front: just delete those phantom SIDs from the local admin group on a routine basis. They shouldn't be there anyway.

Interested in more back to basics articles? Check these out! Want to learn more about your security? Try this article on some of the many problems with lazy security in your environment (yes, akin to a user being a Domain Admin, sysadmin, and local admin). Here is another fantastic article discussing some of the persistent issues I have seen across the board at numerous clients for years and years. And as a prelude to an upcoming article in the 12 Days of Christmas series, here is a refresher on a topic I wrote about just last month.

This is the first article in the 2020 "12 Days of Christmas" series. For the full list of articles, please visit this page.

The post How To Resolve User Error in Kerberos Configuration Manager first appeared on SQL RNNR.

Related Posts:
- The Gift of the SPN (December 10, 2019)
- CRM Data Source Connection Error (January 23, 2020)
- Collation Conflict with Extended Events (March 12, 2020)
- Configuration Manager is Corrupt (December 17, 2019)
- Ad Hoc Queries Disabled (December 20, 2019)

The post How To Resolve User Error in Kerberos Configuration Manager appeared first on SQLServerCentral.


Should There Be A Successor to PASS? from Blog Posts - SQLServerCentral

Anonymous
26 Dec 2020
4 min read
PASS was a big influence on a lot of us and did a lot of good, if never quite as much good as many of us wished. I wish PASS had survived, but it didn't, and now we're at a crossroads for what comes next. We've got short term challenges as far as supporting events that are coming up in the next few months and getting groups moved to an alternative of some sort, but beyond that, we have to decide if we want a successor to PASS or not.

I think the answer depends on what we want the new org to do. What would that new mission statement be, and can all (or most) of us agree on it? Even before we get into funding and a governance model, what can/could a new org do that we care about? My short answer is that a new org should do all the stuff that doesn't make much money, if any. I think it would exist to facilitate career growth in areas not served (or served well) by for-profit businesses. I think it could be an org we see as the glue without being in control. I think it probably doesn't include a Summit-class event, because that just overshadows everything else. I think it could help facilitate regional events via grants and experienced volunteers. I think it can't be all things to all people, but it could be some thing to many people.

Back in 1998, defining the focus as SQL Server was an obvious move. Today there's still plenty of people that use SQL, but there is lots of other stuff going on, and figuring out where to draw the line is important, because that mission statement helps you evaluate everything you do or don't do. Microsoft Data Platform excludes a lot of cool stuff. Focusing on Azure tends to ignore the world of on-premises, AWS, and Google. But…it depends on what you want to accomplish, doesn't it? Is it to be a professional association? To make money? To do good at a larger scale than a single product or profession? Or to narrow the focus, perhaps on day-long events, or SQL DBAs, or growing speakers, or whatever?

I made a small wish list (and surely I could add another 100 lines to this!):
- A real non-profit, with a sturdy and clear charter that includes a commitment to transparency, and one that owns all the intellectual property we choose to put into it (for example, SQLSaturday.com if we can get it)
- A plan for raising the minimal amount of funds needed for things like owning a domain, hosting event sites, etc., and building a multi-year reserve
- No full-time staff and limited outsourcing on a project basis, with all the day-to-day stuff automated or handled by volunteers
- Vendor agnostic, vendor independent, but one that recognizes the importance of vendors in our work and our community
- A solid way of deciding who can be a voting member (one person = one vote) and who can join the Board
- An org that we'll be proud of and hold up as a best-in-class example of how to build a technical professional association
- As few rules as possible

To answer the question I posed in the title, I haven't decided yet (though I started out two weeks ago thinking "yes"). I don't know if it's possible or practical to have a single successor org to PASS. I'm still thinking about it, and waiting to see what ideas bubble up over the next couple of months.

The post Should There Be A Successor to PASS? appeared first on SQLServerCentral.

Merry Christmas from Blog Posts - SQLServerCentral

Anonymous
25 Dec 2020
1 min read
Christmas is this week, so there's no technical post this week. Just a simple post wishing you and your family as many blessings as possible (especially in the year 2020) and good tidings during this holiday time. I hope that 2020 wasn't too harsh on you or anybody close to you. May the holidays bring you peace and joy! Take care and wear a mask! © 2020, John Morehouse. All rights reserved. The post Merry Christmas first appeared on John Morehouse. The post Merry Christmas appeared first on SQLServerCentral.


Daily Coping 25 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
25 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.  Today’s tip is to stop for a minute today and smile while you remember a happy moment in 2020. I’ve had more than my share this year, but my happy moment from 2020 that comes to mind is from February. The only airplane trip of the year for me, to celebrate a birthday. Merry Christmas. The post Daily Coping 25 Dec 2020 appeared first on SQLServerCentral.


Retrieving Log Analytics Data with Data Factory from Blog Posts - SQLServerCentral

Anonymous
24 Dec 2020
6 min read
I've been working on a project where I use Azure Data Factory to retrieve data from the Azure Log Analytics API. The query language used by Log Analytics is Kusto Query Language (KQL). If you know T-SQL, a lot of the concepts translate to KQL. Here's an example T-SQL query and what it might look like in KQL.

--T-SQL:
SELECT *
FROM dbo.AzureDiagnostics
WHERE TimeGenerated BETWEEN '2020-12-15' AND '2020-12-16'
AND database_name_s = 'mydatabasename'

//KQL:
AzureDiagnostics
| where TimeGenerated between(datetime('2020-12-15') .. datetime('2020-12-16'))
| where database_name_s == 'mydatabasename'

For this project, we have several Azure SQL Databases configured to send logs and metrics to a Log Analytics workspace. You can execute KQL queries against the workspace in the Log Analytics user interface in the Azure Portal, in a notebook in Azure Data Studio, or directly through the API. The resulting format of the data downloaded from the API leaves something to be desired (it's like someone shoved a CSV inside a JSON document), but it's usable after a bit of parsing based upon column position. Just be sure your KQL query actually states the columns and their order (this can be done using the project operator).

You can use an Azure Data Factory copy activity to retrieve the results of a KQL query and land them in an Azure Storage account. You must first execute a web activity to get a bearer token, which gives you the authorization to execute the query.

Data Factory pipeline that retrieves data from the Log Analytics API.

I had to create an app registration in Azure Active Directory for the web activity to get the bearer token. The web activity should perform a POST to the following url (with your domain populated and without the quotes): "https://login.microsoftonline.com/[your domain]/oauth2/token". Make sure you have added the appropriate header of Content-Type: application/x-www-form-urlencoded. The body should contain your service principal information and identify the resource as "resource=https://api.loganalytics.io". For more information about this step, see the API documentation.
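If you want to sanity-check the token request and the query outside of Data Factory first, the same two calls can be made from PowerShell. This is a minimal sketch under assumed values: the tenant, app registration, workspace ID, and query below are placeholders, not values from the original post.

# Placeholder identifiers; replace with your own tenant, app registration, and workspace
$tenantId     = "00000000-0000-0000-0000-000000000000"
$clientId     = "<app registration client id>"
$clientSecret = "<app registration client secret>"
$workspaceId  = "<log analytics workspace id>"

# Step 1: get a bearer token (what the ADF web activity does)
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -ContentType "application/x-www-form-urlencoded" `
    -Body @{
        grant_type    = "client_credentials"
        client_id     = $clientId
        client_secret = $clientSecret
        resource      = "https://api.loganalytics.io"
    }

# Step 2: POST the KQL query to the workspace (what the ADF copy activity source does)
$body = @{ query = "AzureDiagnostics | where database_name_s == 'mydatabasename' | take 10" } | ConvertTo-Json
$result = Invoke-RestMethod -Method Post `
    -Uri "https://api.loganalytics.io/v1/workspaces/$workspaceId/query" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" } `
    -ContentType "application/json" `
    -Body $body

# The response comes back as tables/columns/rows, the "CSV inside a JSON document" shape described above
$result.tables[0].rows | Select-Object -First 5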
Data Factory Copy Activity

The source of the copy activity uses the REST connector. The base url is set to "https://api.loganalytics.io/v1/workspaces/[workspace ID]/" (with your workspace ID populated and without the quotes). Authentication is set to Anonymous. Below is my source dataset for the copy activity. Notice that the relative url is set to "query".

ADF dataset referencing a REST linked service pointing to the Log Analytics API.

The Source properties of the copy activity should reference this REST dataset. The request method should be POST, and the KQL query should be placed in the request body (more on this below). Two additional headers need to be added in the Source properties.

Additional headers in the Source properties of the ADF copy activity.

The Authorization header should pass a string formatted as "Bearer [Auth Token]" (with a space between the string "Bearer" and the token). The example above retrieves the token from the web activity that executes before the copy activity in the pipeline. Make sure you are securing your inputs and outputs so your secrets and tokens are not being logged in Data Factory. This option is currently found on the General properties of each activity.

Embedding a KQL Query in the Copy Activity

You must pass the KQL query to the API as a JSON string. But this string is already inside the JSON created by Data Factory, and Data Factory is a bit picky in how you enter the query. Here is an example of how to populate the request body in the copy activity.

{
"query": "AzureDiagnostics | where TimeGenerated between(datetime('2020-12-15') .. datetime('2020-12-16')) | where database_name_s == 'mydatabasename'"
}

Note that the curly braces are on separate lines, but the query must be on one line. So where I had my query spread across 3 lines in the Log Analytics user interface as shown at the beginning of this post, I have to delete the line breaks for the query to work in Data Factory.

The other thing to note is that I am using single quotes to contain string literals. KQL supports either single or double quotes to encode string literals. But using double quotes in your KQL and then putting that inside the double quotes in the request body in ADF leads to errors and frustration (ask me how I know). So make it easy on yourself and use single quotes for any string literals in your KQL query.

In my project, we were looping through multiple databases for customized time frames, so my request body is dynamically populated. Below is a request body similar to what I use for my copy activity that retrieves Azure Metrics such as CPU percent and data storage percent. The values come from a lookup activity. In this case, the SQL stored procedure that is executed by the lookup puts the single quotes around the database name so it is returned as 'mydatabasename'.

{
"query": "AzureMetrics | where TimeGenerated between (datetime(@{item().TimeStart}) .. datetime(@{item().TimeEnd})) | where Resource == @{item().DatabaseName} | project SourceSystem , TimeGenerated , Resource, ResourceGroup , ResourceProvider , SubscriptionId , MetricName , Total , Count , Maximum , Minimum , TimeGrain , UnitName , Type, ResourceId"
}

With dynamically populated queries like the above, string interpolation is your friend. Paul Andrew's post on variable string interpolation in a REST API body helped me understand this and get my API request to produce the required results. You can do similar things with Data Factory to query the Application Insights API. In fact, this blog post on the subject helped me figure out how to get the Log Analytics data I needed.

Be Aware of API Limits

There are limits to the frequency and amount of data you can pull from the Log Analytics API. As noted in the API documentation:
- Queries cannot return more than 500,000 rows
- Queries cannot return more than 64,000,000 bytes (~61 MiB total data)
- Queries cannot run longer than 10 minutes (3 minutes by default)

If there is a risk that you may hit the limit on rows or bytes, you need to be aware that the Log Analytics API does not return an error in this case. It will return the results up to the limit and then note the "partial query failure" in the result set. As far as I can tell, there is no option for pagination, so you will need to adjust your query to keep it under the limits. My current process uses a Get Metadata activity after the copy activity to check file sizes for anything close to the limit, and then breaks that query into smaller chunks and re-executes it.

It's All in the Details

I had a lot of trial and error as I worked my way through populating the request body in the API call and dealing with API limits. I hope this helps you avoid some of the pitfalls.

The post Retrieving Log Analytics Data with Data Factory appeared first on SQLServerCentral.


Daily Coping 24 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
24 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to give away something you have been holding on to. I have made more donations this year than in the past. Partially I think this is because life slowed down and I had time to clean out some spaces. However, I have more to do, and when I saw this item, I decided to do something new. I'm a big supporter of Habitat for Humanity. During my first sabbatical, I volunteered there quite a bit, and I've continued to do that periodically since. I believe shelter is an important resource most people need. I've had some tools at the house that I've held onto, thinking they would be good spares. I have a few cordless items, but I have an older miter saw and a table saw that work fine. Habitat doesn't take these, but I donated them to another local charity that can make use of them. I'm hoping someone will use them to improve their lives, either building something or maybe using them in their work. The post Daily Coping 24 Dec 2020 appeared first on SQLServerCentral.

DirectQuery for Power BI datasets and Azure Analysis Services (preview) from Blog Posts - SQLServerCentral

Anonymous
23 Dec 2020
6 min read
Announced last week is a major new feature for Power BI: you can now use DirectQuery to connect to Azure Analysis Services or Power BI datasets and combine it with other DirectQuery datasets and/or imported datasets. This is a HUGE improvement that has the Power BI community buzzing! Think of it as the next generation of composite models. Note this requires the December version of Power BI Desktop, and you must go to Options -> Preview features and select "DirectQuery for Power BI datasets and Analysis Services".

You begin with a Power BI dataset that has been published to the Power BI service. In Power BI Desktop you connect to the dataset, where you can build a new model over the original one. You can extend the original model by adding tables, columns, and measures, and you can also connect to other datasets and combine them into a single semantic model. While doing this you do not lose any elements from the original model – measures and relationships continue to work. You do not have to worry about anything but the additional data you want to integrate. When the original model is refreshed, your local model also sees any updated information. You work as if you have a local copy of the model and full rights to modify and expand it, even though you are not duplicating any data already stored on the server.

This feature is ideal for report authors who want to combine the data from their enterprise semantic model with other data they may own, like an Excel spreadsheet, or who want to personalize or enrich the metadata from their enterprise semantic model. This seals the marriage between self-service and corporate BI.

The main technology that makes this work is DirectQuery storage mode. DirectQuery allows composite models to work with live connected sources and other data sources like Excel or SQL Server. Using DirectQuery for Power BI datasets and Azure Analysis Services requires that your report has a local model. You can start from a live connection to an existing dataset and upgrade to a local model, or start with a DirectQuery connection or imported data, which will automatically create a local model in your report.

A live connection is basically a connection to the remote model (the model is not inside of Power BI Desktop). Converting the remote model to DirectQuery gives you a local model in Power BI Desktop. So if you want to make changes to your live connection, you will first convert it to a DirectQuery connection. If you don't need to make changes to the remote model and just want to combine it with other models, you can keep the remote model as a live connection. Keep in mind that when you publish a report with a local model to the service, a dataset for that local model will be published as well. This is the same behavior as when you publish a report with imported data to the service.

When connecting to a remote model, the data for that model is kept in the cloud, and you can join it with another local model and its data to create new columns, new measures, and new tables without ever moving the data from the remote model to your local PC. It's an easy way to extend that remote model, which could be managed by IT and refreshed all the time. You are just responsible for managing and refreshing your local model and data. This is how you combine the efforts of enterprise IT and end users.

With this new feature I can see many companies creating enterprise semantic models in Azure Analysis Services or in a premium instance of Power BI. These semantic models can hold the entire business logic of your company, using the huge amount of data from your data warehouse. Then users can run DirectQuery against those models and extend them locally with their own calculations, without having to download any data from the semantic models to Power BI Desktop. This is definitely a game changer.

This new feature also allows you to do dataset chaining. Chaining allows you to publish a report and a dataset that is based on other Power BI datasets, which was previously not possible. Together, datasets and the datasets and models they are based on form a chain. For example, imagine your colleague publishes a Power BI dataset called Sales and Budget that is based on an Azure Analysis Services model called Sales, and combines it with an Excel sheet called Budget. If you create and publish a new report and dataset called Sales and Budget Europe that is based on the Sales and Budget Power BI dataset published by your colleague, making some further modifications or extensions, you are effectively adding a report and dataset to a chain of length three (the max supported), which started with the Sales Azure Analysis Services model and ends with your Sales and Budget Europe Power BI dataset:

This opens a whole new world of possibilities. Before this feature was available, to modify a dataset you would have to get a copy of the pbix file with the dataset and make your own pbix copy, which would also include the data. Also, you were not able to chain models together or to combine datasets (i.e. making models of models). This is quite an improvement!

Share your feedback on this new feature at this Power BI Community forum post.

More info:
- New composite models in Power BI: A milestone in Business Intelligence
- Power BI Direct Query Composite Models = Amazing
- Composite Models Gen 2 and DirectQuery over Power BI Datasets
- Power BI Composite Models using Analysis Services - Direct Query Mode
- Composite models over Power BI datasets and Azure Analysis Services
- "Composite" arrived – CAUTION!
- Prologika Newsletter Winter 2020

The post DirectQuery for Power BI datasets and Azure Analysis Services (preview) first appeared on James Serra's Blog. The post DirectQuery for Power BI datasets and Azure Analysis Services (preview) appeared first on SQLServerCentral.


Basic Cursors in T-SQL–#SQLNewBlogger from Blog Posts - SQLServerCentral

Anonymous
23 Dec 2020
4 min read
Another post for me that is simple and hopefully serves as an example for people trying to get blogging as #SQLNewBloggers.

Cursors are not efficient, and not recommended for use in SQL Server/T-SQL. This is different from other platforms, so be sure you know how things work. There are places where cursors are useful, especially in one-off type situations. I recently had a situation and typed "CREATE CURSOR", which resulted in an error. This isn't valid syntax, so I decided to write a quick post to remind myself what is valid.

The Basic Syntax

Instead of CREATE, a cursor uses DECLARE. The structure is unlike other DDL statements, which are action, type, name, as in CREATE TABLE dbo.MyTable. Instead we have this:

DECLARE cursorname CURSOR

as in

DECLARE myCursor CURSOR

There is more that is needed here. This is just the opening. The rest of the structure is:

DECLARE cursorname CURSOR [options] FOR select_statement

You can see this in the docs, but essentially what we are doing is loading the result of a SELECT statement into an object that we can then process row by row. We give the object a name and structure this with the DECLARE CURSOR FOR. I was recently working on the Advent of Code, and Day 4 asks for some processing across rows. As a result, I decided to try a cursor like this:

DECLARE pcurs CURSOR FOR SELECT lineval FROM day4 ORDER BY linekey;

The next steps are to process the data in the cursor. We do this by fetching data from the cursor as required. I'll build up the structure here, starting with some housekeeping. In order to use the cursor, we need to open it. It's good practice to then deallocate the object at the end, so let's set up this code:

DECLARE pcurs CURSOR FOR SELECT lineval FROM day4 ORDER BY linekey;
OPEN pcurs
...
DEALLOCATE pcurs

This gets us a clean structure if the code is re-run multiple times. Now, after the cursor is open, we fetch data from the cursor. Each column in the SELECT statement can be fetched from the cursor into a variable. Therefore, we also need to declare a variable.

DECLARE pcurs CURSOR FOR SELECT lineval FROM day4 ORDER BY linekey;
OPEN pcurs
DECLARE @val varchar(1000);
FETCH NEXT FROM pcurs INTO @val
...
DEALLOCATE pcurs

Usually we want to process all rows, so we loop through them. I'll add a WHILE loop and use the @@FETCH_STATUS variable. If this is 0, there are still rows in the cursor. If I hit the end of the cursor, a -1 is returned.

DECLARE pcurs CURSOR FOR SELECT lineval FROM day4 ORDER BY linekey;
OPEN pcurs
DECLARE @val varchar(1000);
FETCH NEXT FROM pcurs INTO @val
WHILE @@FETCH_STATUS = 0
BEGIN
    ...
    FETCH NEXT FROM pcurs INTO @val
END
DEALLOCATE pcurs

The ellipsis is where I can do other work: process the value, change it, anything I want to do in T-SQL. I do need to remember to get the next row in the loop. As I mentioned, cursors aren't efficient and you should avoid them, but there are times when row-by-row processing is needed, and a cursor is a good solution to understand.

SQLNewBlogger

As soon as I realized my mistake in setting up the cursor, I knew some of my knowledge had deteriorated. I decided to take a few minutes and describe cursors and document the syntax, mostly for myself. However, this is a way to show what you know about something, even something that might not be used often. You could write a post on replacing a cursor with a set-based solution, or even show where performance is poor from a cursor.

The post Basic Cursors in T-SQL–#SQLNewBlogger appeared first on SQLServerCentral.


Goodbye PASS from Blog Posts - SQLServerCentral

Anonymous
23 Dec 2020
1 min read
“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of light, it was the season of darkness, it was the spring of hope, it-> Continue reading Goodbye PASS The post Goodbye PASS appeared first on Born SQL. The post Goodbye PASS appeared first on SQLServerCentral.

Daily Coping 23 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
23 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to practice gratitude. List the kind things others have done for you.

I've thanked people for their help, which is hard for me. I've learned to be gracious and accepting of help, but I don't really like it. I try to do most things for myself in the world. However, I was injured recently, and while I could get around, it was painful. My wife, daughter, and even the kids I coach noticed and helped me out in a few ways:
- bringing me breakfast in bed
- getting my computer for me from another room
- carrying my backpack
- going to retrieve my phone from another room
- bringing equipment into the gym and taking it out

I thanked everyone and managed to appreciate their efforts without feeling sorry for myself. I haven't always been able to do that.

The post Daily Coping 23 Dec 2020 appeared first on SQLServerCentral.


Daily Coping 22 Dec 2020 from Blog Posts - SQLServerCentral

Pushkar Sharma
22 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to share a happy memory or inspiring thought with a loved one. Not sure I need to explain, but I did show my kids this one from a celebration. The post Daily Coping 22 Dec 2020 appeared first on SQLServerCentral.