
Tech News - Databases

233 Articles

MongoDB withdraws controversial Server Side Public License from the Open Source Initiative's approval process

Richard Gall
12 Mar 2019
4 min read
MongoDB's Server Side Public License was controversial when it was first announced back in October. But the team was, back then, confident that the new license met the Open Source Initiative's approval criteria. However, things seem to have changed. The news that Red Hat was dropping MongoDB over the SSPL in January was a critical blow and appears to have dented MongoDB's ambitions. Last Friday, co-founder and CTO Eliot Horowitz announced that MongoDB had withdrawn its submission to the Open Source Initiative. Horowitz wrote on the OSI approval mailing list that "the community consensus required to support OSI approval does not currently appear to exist regarding the copyleft provision of SSPL." Put simply, the debate around MongoDB's SSPL appears to have led its leadership to reconsider its approach.

Update: this article was amended 19.03.2019 to clarify that the Server Side Public License only requires commercial users (i.e. X-as-a-Service products) to open source their modified code. Any other users can still modify and use MongoDB code for free.

What's the purpose of MongoDB's Server Side Public License?

The Server Side Public License was developed by MongoDB as a means of protecting the project from "large cloud vendors" who want to "capture all of the value but contribute nothing back to the community." Essentially, the license included a key modification to section 13 of the standard GPL (General Public License) that governs most open source software available today. You can read the SSPL in full here, but this is the crucial sentence: "If you make the functionality of the Program or a modified version available to third parties as a service, you must make the Service Source Code available via network download to everyone at no charge, under the terms of this License."

This would mean that users are free to review, modify, and distribute the software or redistribute modifications to the software. It's only if a user modifies or uses the source code as part of an as-a-service offering that the full service must be open sourced. So, essentially, anyone is free to modify MongoDB. It's only when you offer MongoDB as a commercial service that the conditions of the SSPL state that you must open source the entire service.

What issues do people have with the Server Side Public License?

The logic behind the SSPL seems sound, and probably makes a lot of sense in the context of an open source landscape that's almost being bled dry. But it presents a challenge to the very concept of open source software, where the idea that software should be free to use and modify - and, indeed, to profit from - is absolutely central. Moreover, even if it makes sense as a way of defending open source projects from the power of multinational tech conglomerates, it could be argued that the consequences of the license could harm smaller tech companies. As one user on Hacker News explained back in October: "Let [sic] say you are a young startup building a cool SaaS solution. E.g. A data analytics solution. If you make heavy use of MongoDB it is very possible that down the line the good folks at MongoDB come calling since 'the value of your SaaS derives primarily from MongoDB...' So at that point you have two options - buy a license from MongoDB or open source your work (which they can conveniently leverage at no cost)."

The Hacker News thread is very insightful on the reasons why the license has been so controversial. Another Hacker News user, for example, described the license as "either idiotic or malevolent."

Read next: We need to encourage the meta-conversation around open source, says Nadia Eghbal [Interview]

What next for the Server Side Public License?

The license might have been defeated, but Horowitz and MongoDB are still optimistic that they can find a solution. "We are big believers in the importance of open source and we intend to continue to work with these parties to either refine the SSPL or develop an alternative license that addresses this issue in a way that will be accepted by the broader FOSS community," he said. Whatever happens next, it's clear that there are some significant challenges for the open source world that will require imagination and maybe even some risk-taking to properly solve.


Firewall Ports You Need to Open for Availability Groups from Blog Posts - SQLServerCentral

Anonymous
31 Dec 2020
6 min read
Something that never ceases to amaze me is the frequent request for help figuring out what ports are needed for Availability Groups in SQL Server to function properly. These requests arise for a multitude of reasons, from a new AG implementation to the migration of an existing AG to a different VLAN. Whenever these requests come in, it is a good thing in my opinion. Why? Well, it tells me that the network team is trying to establish a more secure operating environment by having segregated VLANs and firewalls between the VLANs. This is always preferable to having firewall rules of ANY/ANY (I correlate that kind of firewall rule to granting "CONTROL" to the public server role in SQL Server).

So What Ports are Needed Anyway?

If you are of the mindset that a firewall rule of ANY/ANY is a good thing, or if your Availability Group is entirely within the same VLAN, then you may not need to read any further - unless, of course, you have a software firewall (such as Windows Defender Firewall) running on your servers. If you are in the category where you do need to figure out which ports are necessary, then this article will provide you with a very good starting point.
Windows Server Clustering

TCP/UDP 53          - User & Computer Authentication [DNS]
TCP/UDP 88          - User & Computer Authentication [Kerberos]
UDP 123             - Windows Time [NTP]
TCP 135             - Cluster DCOM Traffic [RPC, EPM]
UDP 137             - User & Computer Authentication [NetLogon, NetBIOS, Cluster Admin, Fileshare Witness]
UDP 138             - DFS, Group Policy [DFSN, NetLogon, NetBIOS Datagram Service, Fileshare Witness]
TCP 139             - DFS, Group Policy [DFSN, NetLogon, NetBIOS Datagram Service, Fileshare Witness]
UDP 161             - SNMP
TCP/UDP 162         - SNMP Traps
TCP/UDP 389         - User & Computer Authentication [LDAP]
TCP/UDP 445         - User & Computer Authentication [SMB, SMB2, CIFS, Fileshare Witness]
TCP/UDP 464         - User & Computer Authentication [Kerberos Change/Set Password]
TCP 636             - User & Computer Authentication [LDAP SSL]
TCP 3268            - Microsoft Global Catalog
TCP 3269            - Microsoft Global Catalog [SSL]
TCP/UDP 3343        - Cluster Network Communication
TCP 5985            - WinRM 2.0 [Remote PowerShell]
TCP 5986            - WinRM 2.0 HTTPS [Remote PowerShell SECURE]
TCP/UDP 49152-65535 - Dynamic TCP/UDP [Defined by Company Policy {CAN BE CHANGED}; RPC and DCOM] *

SQL Server

TCP 1433            - SQL Server / Availability Group Listener [Default Port {CAN BE CHANGED}]
TCP/UDP 1434        - SQL Server Browser
UDP 2382            - SQL Server Analysis Services Browser
TCP 2383            - SQL Server Analysis Services Listener
TCP 5022            - SQL Server DBM/AG Endpoint [Default Port {CAN BE CHANGED}]
TCP/UDP 49152-65535 - Dynamic TCP/UDP [Defined by Company Policy {CAN BE CHANGED}] *

* Randomly allocated UDP port number between 49152 and 65535

So I have a List of Ports, what now?

Knowing is half the power, and with great knowledge comes great responsibility - or something like that. In reality, now that we know what is needed, the next step is to go out and validate that the ports are open and working. One of the easier ways to do this is with PowerShell.
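As a quick sanity check before running a full sweep, a single TCP port can be probed with Test-NetConnection, the cmdlet behind the TNC alias the script uses (the server name here is a placeholder):

```powershell
# Spot-check one TCP port; 1433 is the default SQL Server listener port.
Test-NetConnection -ComputerName "HomeServer" -Port 1433 |
    Select-Object ComputerName, RemotePort, TcpTestSucceeded
```

If TcpTestSucceeded comes back False for a port you expect to be open, that is the port to raise with the network team.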
$RemoteServers = "Server1","Server2"
$InbndServer   = "HomeServer"
$TCPPorts = "53","88","135","139","162","389","445","464","636","3268","3269",
            "3343","5985","5986","49152","65535","1433","1434","2383","5022"
$UDPPorts = "53","88","123","137","138","161","162","389","445","464","3343",
            "49152","65535","1434","2382"

$TCPResults = @()
$TCPResults = Invoke-Command $RemoteServers {
    param($InbndServer,$TCPPorts)
    $Object = New-Object PSCustomObject
    $Object | Add-Member -MemberType NoteProperty -Name "ServerName" -Value $env:COMPUTERNAME
    $Object | Add-Member -MemberType NoteProperty -Name "Destination" -Value $InbndServer
    Foreach ($P in $TCPPorts){
        $PortCheck = (TNC -Port $p -ComputerName $InbndServer).TcpTestSucceeded
        If($PortCheck -notmatch "True|False"){$PortCheck = "ERROR"}
        $Object | Add-Member Noteproperty "$("Port " + "$p")" -Value "$($PortCheck)"
    }
    $Object
} -ArgumentList $InbndServer,$TCPPorts | select * -ExcludeProperty runspaceid, pscomputername

$TCPResults | Out-GridView -Title "AG and WFC TCP Port Test Results"
$TCPResults | Format-Table * #-AutoSize

$UDPResults = Invoke-Command $RemoteServers {
    param($InbndServer,$UDPPorts)
    $test = New-Object System.Net.Sockets.UdpClient
    $Object = New-Object PSCustomObject
    $Object | Add-Member -MemberType NoteProperty -Name "ServerName" -Value $env:COMPUTERNAME
    $Object | Add-Member -MemberType NoteProperty -Name "Destination" -Value $InbndServer
    Foreach ($P in $UDPPorts){
        Try {
            $test.Connect($InbndServer, $P)
            $PortCheck = "TRUE"
            $Object | Add-Member Noteproperty "$("Port " + "$p")" -Value "$($PortCheck)"
        }
        Catch {
            $PortCheck = "ERROR"
            $Object | Add-Member Noteproperty "$("Port " + "$p")" -Value "$($PortCheck)"
        }
    }
    $Object
} -ArgumentList $InbndServer,$UDPPorts | select * -ExcludeProperty runspaceid, pscomputername

$UDPResults | Out-GridView -Title "AG and WFC UDP Port Test Results"
$UDPResults | Format-Table * #-AutoSize

This script will test all of the related TCP and UDP ports that are required to ensure your Windows Failover Cluster and SQL Server Availability Group work flawlessly. If you execute the script, you will see results similar to the following.

Data Driven Results

In the preceding image, I have combined each of the GridView output windows into a single screenshot. Highlighted in red is the result set for the TCP tests, and in blue is the window for the UDP port test results. With this script, I can take definitive results, all in one screenshot, and share them with the network admin to try and resolve any port deficiencies. This is just a small data-driven tool that can help ensure quicker resolution when making sure the appropriate ports are open between servers. A quicker resolution in opening the appropriate ports means a quicker resolution to the project, and that much sooner you can move on to other tasks to show more value!

Put a bow on it

This article has demonstrated a meaningful and efficient method (along with the valuable documentation) to test and validate the necessary firewall ports for Availability Groups (AG) and Windows Failover Clustering. With the script provided in this article, you can provide quick, value-added service to your project along with valuable documentation of what is truly needed to ensure proper AG functionality. Interested in learning about some additional deep technical information? Check out these articles! Here is a blast from the past that is interesting and somewhat related to SQL Server ports. Check it out here. This is the sixth article in the 2020 "12 Days of Christmas" series. For the full list of articles, please visit this page. The post Firewall Ports You Need to Open for Availability Groups first appeared on SQL RNNR.
Related Posts: Here is an Easy Fix for SQL Service Startup Issues… December 28, 2020 Connect To SQL Server - Back to Basics March 27, 2019 SQL Server Extended Availability Groups April 1, 2018 Single User Mode - Back to Basics May 31, 2018 Lost that SQL Server Access? May 30, 2018 The post Firewall Ports You Need to Open for Availability Groups appeared first on SQLServerCentral.


Your Quick Introduction to Extended Events in Analysis Services from Blog Posts - SQLServerCentral

Anonymous
01 Jan 2021
9 min read
The Extended Events (XEvents) feature in SQL Server is a really powerful tool and one of my favorites. The tool is so powerful and flexible that it can even be used in SQL Server Analysis Services (SSAS). Furthermore, it is such a cool tool that there is an entire site dedicated to XEvents. Sadly, despite the flexibility and power that come with XEvents, there isn't terribly much information about what it can do with SSAS. This article intends to help shed some light on XEvents within SSAS from an internals and introductory point of view, with the hope of leading to more in-depth articles on how to use XEvents with SSAS.

Introducing your Heavy Weight Champion of the SQLverse - XEvents

For all of the power, might, strength, and flexibility of XEvents, its use in the realm of SSAS is practically next to nothing. Much of that is due to three factors: 1) lack of a GUI, 2) addiction to Profiler, and 3) inadequate information about XEvents in SSAS. This last reason can be coupled with a sub-reason of "nobody is pushing XEvents in SSAS." For me, these are all just excuses to remain attached to a bad habit. While it is true that, just like in SQL Server, earlier versions of SSAS did not have a GUI for XEvents, that excuse is no longer valid. As for the inadequate information about the feature, I am hopeful that we can treat that excuse starting with this article. In regards to the Profiler addiction, never fear: there is a GUI, and the Profiler events are accessible via that GUI just the same as the new XEvents events are. How do we know this? Well, the GUI tells us just as much, as shown here.

In the preceding image, I have two sections highlighted in red. The first is evidence that this is the GUI for SSAS; note that the connection box states "Group of Olap servers." The second highlight demonstrates the two types of categories in XEvents for SSAS.
These two categories, as you can see, are "profiler" and "purexevent" (not to be confused with "Purex® event"). In short: yes, Virginia, there is an XEvent GUI, and that GUI incorporates your favorite Profiler events as well.

Let's See the Nuts and Bolts

This article is not about introducing the GUI for XEvents in SSAS; I will get to that in a future article. This article is here to introduce you to the stuff behind the scenes. In other words, we want to look at the metadata that helps govern the XEvents feature within the sphere of SSAS. To explore the underpinnings of XEvents in SSAS efficiently, we first need to set up a linked server to make querying the metadata easier.

EXEC master.dbo.sp_addlinkedserver
      @server = N'SSASDIXNEUFLATIN1'       --whatever LinkedServer name you desire
    , @srvproduct = N'MSOLAP'
    , @provider = N'MSOLAP'
    , @datasrc = N'SSASServerSSASInstance'  --change your data source to an appropriate SSAS instance
    , @catalog = N'DemoDays'                --change your default database
GO

EXEC master.dbo.sp_addlinkedsrvlogin
      @rmtsrvname = N'SSASDIXNEUFLATIN1'
    , @useself = N'False'
    , @locallogin = NULL
    , @rmtuser = NULL
    , @rmtpassword = NULL
GO

Once the linked server is created, you are primed and ready to start exploring SSAS and the XEvent feature metadata. The first thing to do is familiarize yourself with the system views that drive XEvents. You can do this with the following query.

SELECT lq.*
FROM OPENQUERY(SSASDIXNEUFLATIN1, 'SELECT * FROM $system.dbschema_tables') AS lq
WHERE CONVERT(VARCHAR(100), lq.TABLE_NAME) LIKE '%XEVENT%'
   OR CONVERT(VARCHAR(100), lq.TABLE_NAME) LIKE '%TRACE%'
ORDER BY CONVERT(VARCHAR(100), lq.TABLE_NAME);

When the preceding query is executed, you will see results similar to the following. In this image you will note that I have two sections highlighted. The first section, in red, is the group of views related to the trace/Profiler functionality. The second section, in blue, is the group of views related to the XEvents feature in SSAS.
Unfortunately, this does demonstrate that XEvents in SSAS is a bit less mature than one may expect, and definitely less mature in SSAS than in the SQL engine. That shortcoming aside, we will use these views to explore further into the world of XEvents in SSAS.

Exploring Further

Knowing what the group of tables looks like, we have a fair idea of where to look next in order to become more familiar with XEvents in SSAS. The views I would primarily focus on (at least for this article) are DISCOVER_TRACE_EVENT_CATEGORIES, DISCOVER_XEVENT_OBJECTS, and DISCOVER_XEVENT_PACKAGES. Granted, I will only be using the DISCOVER_XEVENT_PACKAGES view very minimally. From here is where things get a little more tricky. I will take advantage of temp tables and some more OPENQUERY trickery to dump the data so it can be related and used in an easily consumable format. Before getting into the queries I will use, first a description of the objects I am using:

DISCOVER_TRACE_EVENT_CATEGORIES is stored in XML format and is basically a definition document of the Profiler-style events. In order to consume it, the XML needs to be parsed and shaped into a better format.

DISCOVER_XEVENT_PACKAGES is the object that lets us know what area of SSAS an event is related to, and it is a very basic attempt at grouping some of the events into common domains.

DISCOVER_XEVENT_OBJECTS is where the majority of the action resides for Extended Events. This object defines the different object types (actions, targets, maps, messages, and events - more on that in a separate article).

Script Fun

Now for the fun in the article!
IF OBJECT_ID('tempdb..#SSASXE') IS NOT NULL
BEGIN
    DROP TABLE #SSASXE;
END;
IF OBJECT_ID('tempdb..#SSASTrace') IS NOT NULL
BEGIN
    DROP TABLE #SSASTrace;
END;

SELECT CONVERT(VARCHAR(MAX), xo.Name) AS EventName
     , xo.description AS EventDescription
     , CASE
           WHEN xp.description LIKE 'SQL%' THEN 'SSAS XEvent'
           WHEN xp.description LIKE 'Ext%' THEN 'DLL XEvents'
           ELSE xp.name
       END AS PackageName
     , xp.description AS CategoryDescription --very generic due to it being the package description
     , NULL AS CategoryType
     , 'XE Category Unknown' AS EventCategory
     , 'PureXEvent' AS EventSource
     , ROW_NUMBER() OVER (ORDER BY CONVERT(VARCHAR(MAX), xo.name)) + 126 AS EventID
INTO #SSASXE
FROM (
    SELECT * FROM OPENQUERY(SSASDIXNEUFLATIN1, 'select * From $system.Discover_Xevent_Objects')
) xo
INNER JOIN (
    SELECT * FROM OPENQUERY(SSASDIXNEUFLATIN1, 'select * FROM $system.DISCOVER_XEVENT_PACKAGES')
) xp ON xo.package_id = xp.id
WHERE CONVERT(VARCHAR(MAX), xo.object_type) = 'event'
  AND xp.ID <> 'AE103B7F-8DA0-4C3B-AC64-589E79D4DD0A'
ORDER BY CONVERT(VARCHAR(MAX), xo.[name]);

SELECT ec.x.value('(./NAME)[1]', 'VARCHAR(MAX)') AS EventCategory
     , ec.x.value('(./DESCRIPTION)[1]', 'VARCHAR(MAX)') AS CategoryDescription
     , REPLACE(d.x.value('(./NAME)[1]', 'VARCHAR(MAX)'), ' ', '') AS EventName
     , d.x.value('(./ID)[1]', 'INT') AS EventID
     , d.x.value('(./DESCRIPTION)[1]', 'VARCHAR(MAX)') AS EventDescription
     , CASE ec.x.value('(./TYPE)[1]', 'INT')
           WHEN 0 THEN 'Normal'
           WHEN 1 THEN 'Connection'
           WHEN 2 THEN 'Error'
       END AS CategoryType
     , 'Profiler' AS EventSource
INTO #SSASTrace
FROM (
    SELECT CONVERT(XML, lq.[Data])
    FROM OPENQUERY(SSASDIXNEUFLATIN1, 'Select * from $system.Discover_trace_event_categories') lq
) AS evts(event_data)
CROSS APPLY event_data.nodes('/EVENTCATEGORY/EVENTLIST/EVENT') AS d(x)
CROSS APPLY event_data.nodes('/EVENTCATEGORY') AS ec(x)
ORDER BY EventID;

SELECT ISNULL(trace.EventCategory, xe.EventCategory) AS EventCategory
     , ISNULL(trace.CategoryDescription, xe.CategoryDescription) AS CategoryDescription
     , ISNULL(trace.EventName, xe.EventName) AS EventName
     , ISNULL(trace.EventID, xe.EventID) AS EventID
     , ISNULL(trace.EventDescription, xe.EventDescription) AS EventDescription
     , ISNULL(trace.CategoryType, xe.CategoryType) AS CategoryType
     , ISNULL(CONVERT(VARCHAR(20), trace.EventSource), xe.EventSource) AS EventSource
     , xe.PackageName
FROM #SSASTrace trace
FULL OUTER JOIN #SSASXE xe ON trace.EventName = xe.EventName
ORDER BY EventName;

Given the level of maturity of XEvents in SSAS, there is some massaging of the data that has to be done so that we can correlate the trace events to the XEvents events - little things like missing EventIDs or missing categories in the XEvents events. That's fine; we are able to work around it and produce results similar to the following. If you compare it to the GUI, you will see that it is somewhat similar, and it should help bridge the gap between the metadata and the GUI for you.

Put a bow on it

Extended Events is a powerful tool for many facets of SQL Server. While it may still be rather immature in the world of SSAS, it still has a great deal of benefit and power to offer. Getting to know XEvents in SSAS can be a crucial skill in improving your Data Superpowers, and it is well worth the time spent trying to learn such a cool feature. Interested in learning more about the depth and breadth of Extended Events? Check these out or check out the XE website here. Want to learn more about your indexes? Try this index maintenance article or this index size article. This is the seventh article in the 2020 "12 Days of Christmas" series. For the full list of articles, please visit this page. The post Your Quick Introduction to Extended Events in Analysis Services first appeared on SQL RNNR.
Related Posts: Extended Events Gets a New Home May 18, 2020 Profiler for Extended Events: Quick Settings March 5, 2018 How To: XEvents as Profiler December 25, 2018 Easy Open Event Log Files June 7, 2019 Azure Data Studio and XEvents November 21, 2018 The post Your Quick Introduction to Extended Events in Analysis Services appeared first on SQLServerCentral.


Logging the history of my past SQL Saturday presentations from Blog Posts - SQLServerCentral

Anonymous
31 Dec 2020
3 min read
(2020-Dec-31) PASS (formerly known as the Professional Association for SQL Server) is the global community for data professionals who use the Microsoft data platform. On December 17, 2020, PASS announced that because of COVID-19 it was ceasing all operations effective January 15, 2021. PASS offered many training and networking opportunities; one such training stream was SQL Saturday. PASS SQL Saturdays were free training events designed to expand knowledge sharing and the learning experience for data professionals.

Photo by Daniil Kuželev on Unsplash

Since the content and historical records of SQL Saturday will soon become unavailable, I decided to log the history of all my past SQL Saturday presentations. For this table I give full credit to André Kamman and Rob Sewell, who extracted and saved this information here: https://sqlsathistory.com/.

My SQL Saturday history

Date       | Name                           | Location       | Track                                    | Title
2016/04/16 | SQLSaturday #487 Ottawa 2016   | Ottawa         | Analytics and Visualization              | Excel Power Map vs. Power BI Globe Map visualization
2017/01/03 | SQLSaturday #600 Chicago 2017  | Addison        | BI Information Delivery                  | Power BI with Narrative Science: Look Who's Talking!
2017/09/30 | SQLSaturday #636 Pittsburgh 2017 | Oakdale      | BI Information Delivery                  | Geo Location of Twitter messages in Power BI
2018/09/29 | SQLSaturday #770 Pittsburgh 2018 | Oakdale      | BI Information Delivery                  | Power BI with Maps: Choose Your Destination
2019/02/02 | SQLSaturday #821 Cleveland 2019 | Cleveland     | Analytics Visualization                  | Power BI with Maps: Choose Your Destination
2019/05/10 | SQLSaturday #907 Pittsburgh 2019 | Oakdale      | Cloud Application Development Deployment | Using Azure Data Factory Mapping Data Flows to load Data Vault
2019/07/20 | SQLSaturday #855 Albany 2019   | Albany         | Business Intelligence                    | Power BI with Maps: Choose Your Destination
2019/08/24 | SQLSaturday #892 Providence 2019 | East Greenwich | Cloud Application Development Deployment | Continuous integration and delivery (CI/CD) in Azure Data Factory
2020/01/02 | SQLSaturday #930 Cleveland 2020 | Cleveland     | Database Architecture and Design         | Loading your Data Vault with Azure Data Factory Mapping Data Flows
2020/02/29 | SQLSaturday #953 Rochester 2020 | Rochester     | Application Database Development         | Loading your Data Vault with Azure Data Factory Mapping Data Flows

Closing notes

I think I have already told this story a couple of times. Back in 2014-2015, I started to attend SQL Saturday training events in the US by driving from Toronto. At that time I had only spoken a few times at our local user group and had never presented at SQL Saturdays. While driving, I needed to pass customs control at the US border, and a customs officer would usually ask me a set of questions about my place of work, my citizenship, and the destination of my trip. I answered that I was going to attend an IT conference called SQL Saturday, a free event for data professionals. At that point, the customs officer positively challenged me and told me that I needed to start teaching others based on my long experience in IT; we laughed, and then he let me pass the border. I'm still very thankful to that US customs officer for this positive affirmation.
SQL Saturdays have been a great journey for me! The post Logging the history of my past SQL Saturday presentations appeared first on SQLServerCentral.


Storage savings with Table Compression from Blog Posts - SQLServerCentral

Anonymous
31 Dec 2020
2 min read
In one of my recent assignments, my client asked me for a solution to reduce the disk space requirement of the staging database of an ETL workload. That led me to study and compare the Table Compression feature of SQL Server. This article will not explain Compression itself but will compare the storage and performance aspects of compressed vs non-compressed tables. I found a useful article on Compression written by Gerald Britton; it's quite comprehensive and covers most aspects of Compression.

For my POC, I made use of an SSIS package. I kept two data flows with the same table and file structure, but one with Table Compression enabled and the other without. The table and file had around 100 columns, all of VARCHAR datatype, since the POC was for a staging database that temporarily holds raw data from flat files. I also had to work on the conversion of the flat file source output columns to make them compatible with the destination SQL Server table structure. The POC was done with various file sizes because we also wanted to identify the optimal file size. So we did two things in a single POC: compared Compression and found the optimal file size for the ETL process. The POC was very simple, with two data flows; both had flat files as the source and a SQL Server table as the destination. Here is the comparison recorded post-POC. I think you will find it useful in deciding whether it's worth implementing Compression in your respective workload.

Findings

Space saving: approx. 87% of space saved.
Write execution time: no difference.
Read execution time: slight/negligible difference. A plain SELECT statement was executed to compare read execution time. The compressed table took 10-20 seconds more, which is approx. <2%. Compared to the disk space saved, this slight overhead was acceptable in our workload. However, you need to review your case thoroughly before taking any decision.
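For reference, SQL Server can estimate the savings before you commit to rebuilding a large staging table. A minimal T-SQL sketch (the schema and table names here are hypothetical, not from the POC):

```sql
-- Estimate PAGE compression savings for a hypothetical staging table.
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'StgRawImport',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = 'PAGE';

-- If the estimate looks good, apply PAGE compression with a rebuild.
ALTER TABLE dbo.StgRawImport REBUILD WITH (DATA_COMPRESSION = PAGE);
```

Running the estimate first avoids a costly rebuild on tables whose data does not compress well.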
The post Storage savings with Table Compression appeared first on SQLServerCentral.


Creating an HTML URL from a PowerShell String–#SQLNewBlogger from Blog Posts - SQLServerCentral

Anonymous
30 Dec 2020
2 min read
Another post for me that is simple and hopefully serves as an example for people trying to get blogging as #SQLNewBloggers.

I wrote about getting a quick archive of SQL Saturday data last week, and while doing that, I had some issues building the HTML needed in PowerShell. I decided to work through this a bit and determine what was wrong. My original code looked like this:

$folder = "E:DocumentsgitSQLSatArchiveSQLSatArchiveSQLSatArchiveClientApppublicAssetsPDF"
$code = ""
$list = Get-ChildItem -Path $folder
ForEach ($File in $list) {
    #write-host($File.name)
    $code = $code + "<li><a href=$($File.Name)>$($File.BaseName)</a></li>"
}
write-host($code)

This gave me the code I needed, which I then edited in SSMS to get the proper formatting. However, I knew this needed to work. I had used single quotes and then added in the slashes, but that didn't work. This code:

$folder = "E:DocumentsgitSQLSatArchiveSQLSatArchiveSQLSatArchiveClientApppublicAssetsPDF"
$code = ""
$list = Get-ChildItem -Path $folder
ForEach ($File in $list) {
    #write-host($File.name)
    $code = $code + '<li><a href="/Assets/PDF/$($File.Name)" >$($File.BaseName)</a></li>'
}
write-host($code)

produced this type of output:

<li><a href="/Assets/PDF/$($File.Name)" >$($File.BaseName)</a></li>

Not exactly top-notch HTML. I decided that I should look around. I found a post on converting some data to HTML, which wasn't what I wanted, but it had a clue in there: the double quotes. I needed to escape quotes here, as I wanted the double quotes around my string. I changed the line building the string to this:

$code = $code + "<li><a href=""/Assets/PDF/$($File.Name)"" >$($File.BaseName)</a></li>"

And I then had what I wanted:

<li><a href="/Assets/PDF/1019.pdf" >1019</a></li>

Strings in PoSh can be funny, so a little attention to escaping things and knowing about variables and double quotes is helpful.

SQLNewBlogger

This was about 15 minutes of messing with Google and PoSh to solve, but then only about 10 minutes to write up. A good example that shows some research, initiative, and investigation in addition to solving a problem. The post Creating an HTML URL from a PowerShell String–#SQLNewBlogger appeared first on SQLServerCentral.
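As a side-by-side comparison, there are three common ways in PowerShell to get literal double quotes inside an expanding (double-quoted) string; a small sketch with a placeholder file name:

```powershell
$name = "1019.pdf"

# 1. Doubled quotes inside a double-quoted string (the fix used above).
$a = "<li><a href=""/Assets/PDF/$name"">$name</a></li>"

# 2. Backtick-escaped quotes behave the same way.
$b = "<li><a href=`"/Assets/PDF/$name`">$name</a></li>"

# 3. A double-quoted here-string needs no escaping at all.
$c = @"
<li><a href="/Assets/PDF/$name">$name</a></li>
"@

$a -eq $b   # the first two produce identical strings
```

Single-quoted strings, by contrast, never expand `$()` subexpressions, which is exactly why the second attempt in the post printed the variables literally.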

Daily Coping 31 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
31 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag.

Today's tip is to plan some new acts of kindness to do in 2021.

As I get older, I do try to spend more time volunteering and helping others more than myself. I've had success, my children are adults, and I find fewer "wants" for myself than I feel the impetus to help others. I also hope more people feel this, perhaps at a younger age than I am. In any case, I have a couple of things for 2021 that I'd like to do:

  • Random acts - I saw this in a movie or show recently, where someone was buying a coffee or something small for a stranger once a week. I need to do that, especially if I get the chance to go out again.
  • DataSaturdays - The demise of PASS means more support is needed for people that might want to run an event, so I need to be prepared to help others again.
  • Coaching - I have been coaching kids, but they've been privileged kids. I'd like to switch to kids that lack some of the support and privileges of the kids I usually deal with. I'm hoping things get moving with sports again and I get the chance to talk to the local Starlings program.

The post Daily Coping 31 Dec 2020 appeared first on SQLServerCentral.

Here is an Easy Fix for SQL Service Startup Issues When Using an MSA from Blog Posts - SQLServerCentral

Anonymous
28 Dec 2020
6 min read
So far, the articles in this year’s 12 Days of Christmas series (first and second days) have shown some relatively easy fixes for some rather annoying problems. In this article, I will continue that same pattern and share another frustrating problem related to both Kerberos and Managed Service Accounts (MSAs). What is an MSA, you might ask? That is a good question. If you are unacquainted with this special type of service account, you have been missing out. An MSA is a special account managed by the domain controller. It is an account that is used to run a service on a single computer (note: a Group Managed Service Account, or gMSA, is another type of MSA that can be used on multiple computers). Since the password is managed by the domain and is frequently changed, humans won’t know the password and will not be able to use it to log on to a computer. This type of service account is preferred because of the improved security on the account. That sounds great, right? It is, until you encounter some of the oddball failures and randomness of the service not starting as you might expect. Wayne Sheffield addressed some of the problems relating to the service starting in his article here. As luck would have it, there is a similar problem that needs just a little more action on our part in order to nudge the service to start consistently.

Fix SQL Service not Starting after Reboot when using an MSA

As a general rule of thumb, I recommend adding the service dependencies to your SQL Server Service. These service dependencies are (as mentioned in Wayne’s article): Netlogon, KEYISO, and W32Time. Additionally, I recommend that you double check the dependency for the SQL Server Agent to ensure it is dependent upon the SQL Server Service. Most of the time, the agent service will already have the SQL Server Service dependency and the SQL Server Service will have the KEYISO dependency. However, that is not a foregone conclusion.
Occasionally, those dependencies are missing. Netlogon and W32Time are almost always missing, so you will need to add those dependencies most of the time. I won’t go into further detail about setting these dependencies today. Wayne showed how to use regedit to set the dependencies. That said, these dependencies should be set prior to applying the “fix” I will mention shortly in this article. In addition, stay tuned because I will be showing in the near future how to set those dependencies via PowerShell.

Occasional Errors after Reboot

The problem of the SQL Server Service not starting after a reboot when using an MSA is ridiculously frustrating. Why? Well, you get very little information as to what the problem is. The event viewer just shows (most of the time) that the service stopped and doesn’t give an error code. In fact, it just shows it as an informational event. Every once in a while, you may get a gold nugget of a clue with a message similar to the following:

NETLOGON error: This computer was not able to set up a secure session with a domain controller in domain TurdWilligers due to the following: There are currently no logon servers available to service the logon request. This may lead to authentication problems. Make sure that this computer is connected to the network. If the problem persists, please contact your domain administrator. ADDITIONAL INFO: If this computer is a domain controller for the specified domain, it sets up the secure session to the primary domain controller emulator in the specified domain. Otherwise, this computer sets up the secure session to any domain controller in the specified domain.

Despite the service dependency on Netlogon, you can still see SQL Server try to start before the Netlogon service is fully functional. This is disastrous, right? Not exactly. Despite the flawed attempt to start a bit quickly there, we still have our methods to avoid this problem in the future. You will laugh at how ridiculously easy this fix is.
Are you ready for it?

And now the Fix

The fix is (after setting the service dependencies mentioned previously) to set the SQL Server Service to “Automatic (Delayed Start)” instead of “Automatic” as shown in the following image. All that needs to be done is this quick and simple change in the services control panel (not SQL Server Configuration Manager). Once applied, reboot the server to validate that it worked. TADA – problem solved.

Now, take this fix and make your environment consistent. Or not. The choice is yours. Personally, I prefer to have my environment as consistent as absolutely possible. There are times when that is not plausible, and it is in these times that it is very highly recommended to DOCUMENT what is different and the reasons why it is different. This makes your job significantly easier and gives you back a significant amount of time that you will grow to appreciate more and more over time.

Put a bow on it

Running into startup issues with your services could be a nightmare on some days. Couple those issues with an MSA and you may be looking at a bald spot on your head before long. Never fear, this article demonstrates that startup issues involving MSAs can be an extremely simple condition to resolve. The most benign of changes affecting your service startup can be the difference between SQL Server starting normally after an unexpected reboot or SQL Server being down until somebody “notices” the problem. As mentioned previously in this article, please stay tuned for future tips on how to simplify adding the service dependencies. It should prove a useful tool in your arsenal, helping you to become more efficient and elite.

Enjoyed reading some of these security related articles? Security is a fantastic topic and there is plenty to learn in regards to security. Try this article for more related to Kerberos issues. Or maybe you want something a little more in your face about security? Try this back to basics article about public role permissions.
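If you prefer to script the delayed-start change described above rather than click through the services control panel, sc.exe can make the same change (a sketch, not from the original post — MSSQLSERVER is the default instance’s service name; a named instance would be MSSQL$InstanceName, and the prompt must be elevated):

```powershell
sc.exe config MSSQLSERVER start= delayed-auto
```

Note the space after start= — it is required by sc.exe’s argument syntax. A reboot is still the real test that the service comes up cleanly.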
This is the third article in the 2020 “12 Days of Christmas” series. For the full list of articles, please visit this page. The post Here is an Easy Fix for SQL Service Startup Issues When Using an MSA first appeared on SQL RNNR. Related Posts: CRM Data Source Connection Error January 23, 2020 SHUTDOWN SQL Server December 3, 2018 Single User Mode - Back to Basics May 31, 2018 The Gift of the SPN December 10, 2019 How To Resolve User Error in Kerberos Configuration Manager December 26, 2020 The post Here is an Easy Fix for SQL Service Startup Issues When Using an MSA appeared first on SQLServerCentral.

How to Easily Grant Permissions to all Databases from Blog Posts - SQLServerCentral

Anonymous
29 Dec 2020
6 min read
A recurring need that I have seen is a means to grant a user or group of users access to all databases in one fell swoop. Recently, I shared an introductory article to this requirement. In this article, I will demonstrate how to easily grant permissions to all databases via the use of Server Roles. When talking about Server Roles, I don’t mean the Fixed Server Roles. It would be crazy easy, insane, and stupid to just use the fixed server roles to grant users access to all databases. Why? Well, only two of the fixed server roles would cover the permission scope needed by most users to access a database – and use said database. Those roles are sysadmin and securityadmin.

The Bad, the Bad, and the Ugly

The sysadmin role should be fairly obvious and is generally what every vendor and a majority of developers insists on having. We all know how dangerous and wrong that would be. The securityadmin fixed server role, on the other hand, is less obvious. That said, securityadmin can grant permissions and should therefore be treated the same as sysadmin. By no means do we ever want to grant access (as a shortcut / easy method) via these roles; that would be security-defeating. There is one more role that seems to be a popular choice – the public role. Visualize a child’s eyes rolling into the back of their head and you have my reaction to this option. This is the ugly of the options but cannot go without mentioning because I deal with vendors on a regular basis that continue to insist on doing things this way. This is not necessarily the easy method because you have to manually grant permissions to the public fixed server role, so this comes with some work, but it is just flat stupid and lazy to grant all of these permissions to the public role. Here is an article on this absurd method.

Create Custom Server Roles for all DB Access

The pre-requisite for there to be an easy button is to create your own Server-Level role.
I demonstrated how to do this in a previous article and will gloss over it quickly again here.

IF NOT EXISTS
(
    SELECT name
    FROM sys.server_principals
    WHERE name = 'Gargouille'
)
BEGIN
    CREATE LOGIN [Gargouille]
        WITH PASSWORD = N'SuperDuperLongComplexandHardtoRememberPasswordlikePassw0rd1!'
        , DEFAULT_DATABASE = []
        , CHECK_EXPIRATION = OFF
        , CHECK_POLICY = OFF;
END;

--check for the server role
IF NOT EXISTS
(
    SELECT name
    FROM sys.server_principals
    WHERE name = 'SpyRead'
        AND type_desc = 'SERVER_ROLE'
)
BEGIN
    CREATE SERVER ROLE [SpyRead] AUTHORIZATION [securityadmin];
    GRANT CONNECT ANY DATABASE TO [SpyRead];
END;

USE master;
GO

IF NOT EXISTS
(
    SELECT mem.name AS MemberName
    FROM sys.server_role_members rm
        INNER JOIN sys.server_principals sp
            ON rm.role_principal_id = sp.principal_id
        LEFT OUTER JOIN sys.server_principals mem
            ON rm.member_principal_id = mem.principal_id
    WHERE sp.name = 'SpyRead'
        AND sp.type_desc = 'SERVER_ROLE'
        AND mem.name = 'Gargouille'
)
BEGIN
    ALTER SERVER ROLE [SpyRead] ADD MEMBER [Gargouille];
END;

In this demo script, I have created a user and a server-level role, then added that user to the role. The only permission currently on the server-level role is “Connect Any Database”. Now, let’s say we need to be able to grant permissions to all databases for this user to be able to read (that would be SELECT in SQL terms) data. The only thing I need to do for the role would be to make this permission change.

GRANT SELECT ALL USER SECURABLES TO [SpyRead];

That is a far simpler approach, right? Let’s see how it might look to add a user to the role from SQL Server Management Studio (SSMS). After creating a custom server role, you will be able to see it from the login properties page and then add the login directly to the role from the GUI. That makes the easy button just a little bit better.

Test it out

Now, let’s test the permission to select from a database.
EXECUTE AS LOGIN = 'Gargouille';
GO
USE [];
GO
-- no permissions on server state
SELECT * FROM sys.dm_os_wait_stats;
GO
--Yet can select from any database
SELECT USER_NAME();
SELECT * FROM sys.objects;
REVERT;

Clearly, you will need to change your database name unless by some extreme chance you also have a database by the name of . Testing this will prove that the user can connect to the database and can also select data from that database.

Now this is where it gets a little dicey. Suppose you wish to grant the delete option (not a super wise idea, to be honest) to a user in every database. That won’t work with this method. You would need to grant those permissions on a per case basis. Where this solution works best is for permissions that are at the server scope. Permissions at this scope include things such as “Control Server”, “View any Definition”, “View Server State”, and “Select all USER Securables”. This isn’t a complete list, but just enough to give you an idea. That said, how often do you really need to have a user be able to change data in EVERY database on a server? I certainly hope your security is not set up in such a fashion.

Caveat

Suppose you decide to utilize the permission “SELECT ALL USER SECURABLES”; there is an additional feature that comes with it. This permission can be used to deny SELECT permission against all databases as well. As a bonus, it works to block sysadmins as well – sort of. It does deny the SELECT permission, unlike other methods, when applied to a sysadmin; however, any sysadmin worth their salt can easily revoke that permission because they have “CONTROL” server permission. That said, it would be a worthwhile trick to play on your junior dbas to see what they do.

Put a bow on it

As Data Professionals, we are always trying to find more efficient ways of doing the job. Sometimes, against our best advice, we are required to find a more efficient way to give users access to more than they probably should have.
This article demonstrates one method to easily grant READ access to all databases while still keeping the environment secure and hitting that chord of having done it more efficiently.

Interested in learning about some deep technical information? Check these out! Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the fifth article in the 2020 “12 Days of Christmas” series. For the full list of articles, please visit this page.

The post How to Easily Grant Permissions to all Databases first appeared on SQL RNNR. Related Posts: Server-Level Roles - Back to Basics November 20, 2020 When Too Much is Not a Good Thing December 13, 2019 SQL Server User Already Exists - Back to Basics January 24, 2018 SHUTDOWN SQL Server December 3, 2018 Who needs data access? Not You! December 12, 2019 The post How to Easily Grant Permissions to all Databases appeared first on SQLServerCentral.
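The deny-side caveat described above can be sketched like this (using the demo role name from this article; run it as a member of sysadmin or securityadmin):

```sql
USE master;
GO
-- Blocks SELECT in every database for members of the role,
-- nominally including sysadmins:
DENY SELECT ALL USER SECURABLES TO [SpyRead];
GO
-- A sysadmin can undo the deny, since they hold CONTROL SERVER:
REVOKE SELECT ALL USER SECURABLES FROM [SpyRead];
GO
```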

Experiments With Go Arrays and Slices from Blog Posts - SQLServerCentral

Anonymous
30 Dec 2020
5 min read
Simplicity Over Syntactic Sugar

As I’ve been learning Go, I’ve grown to learn that many decisions to simplify the language have removed features that provide more succinct expressions in languages such as Python, PowerShell, C#, and others. The non-orthogonal features in those languages result in many expressive ways something can be done, but at a cost, according to Go’s paradigm. My background is also heavily focused on relational databases and set-based work, so as I study more programming paradigms separate from any database involvement, I’m realizing that there is a fundamental difference in the way a database developer and a backend developer look at this. Rather than declarative syntax, you need to focus a lot more on iterating through collections and manipulating them. As I explored my assumptions, I found that even in .NET, Linq expressions abstract the same basic concept of loops and iterations behind simpler syntax, but are not fundamentally doing true set selections. In fact, in some cases I’ve read that Linq performance is often worse than a simple loop (see this interesting stack overflow answer). The catch is that the Linq expression might be more maintainable in an enterprise environment at the cost of some degraded performance (excluding some scenarios like deferred execution). For example, in PowerShell, you can work with arrays in a multitude of ways.

$array[4..10] | ForEach-Object {}
# or
foreach ($item in $array[$start..$end]) {}

This syntactic sugar provides brevity, but as two ways among many I can think of, it does add variety and performance considerations. Go strips this cognitive load away by giving only a few ways to do the same thing.

Using For Loop

This example is just int slices, but I’m trying to understand the options as I range through a struct as well.
When working through these examples for this question, I discovered, thanks to Rubber Duck debugging, that you can simplify slice selection using newSlice := arr[2:5].

Simple Loop

As an example (Goplay Link To Run):

package main

import "fmt"

func main() {
	startIndex := 2
	itemsToSelect := 3
	arr := []int{10, 15, 20, 25, 35, 45, 50}
	fmt.Printf("starting: arr: %v\n", arr)
	newCollection := []int{}
	fmt.Printf("initialized newCollection: %v\n", newCollection)
	for i := 0; i < itemsToSelect; i++ {
		newCollection = append(newCollection, arr[i+startIndex])
		fmt.Printf("\tnewCollection: %v\n", newCollection)
	}
	fmt.Printf("= newCollection: %v\n", newCollection)
	fmt.Print("expected: 20, 25, 35\n")
}

This would result in:

starting: arr: [10 15 20 25 35 45 50]
initialized newCollection: []
	newCollection: [20]
	newCollection: [20 25]
	newCollection: [20 25 35]
= newCollection: [20 25 35]
expected: 20, 25, 35

Moving Loop to a Function

Assuming there are no more effective selection libraries in Go, I’m assuming I’d write functions for this behavior such as (Goplay Link To Run):
package main

import "fmt"

func main() {
	startIndex := 2
	itemsToSelect := 3
	arr := []int{10, 15, 20, 25, 35, 45, 50}
	fmt.Printf("starting: arr: %v\n", arr)
	newCollection := GetSubselection(arr, startIndex, itemsToSelect)
	fmt.Printf("GetSubselection returned: %v\n", newCollection)
	fmt.Print("expected: 20, 25, 35\n")
}

func GetSubselection(arr []int, startIndex int, itemsToSelect int) (newSlice []int) {
	fmt.Printf("newSlice: %v\n", newSlice)
	for i := 0; i < itemsToSelect; i++ {
		newSlice = append(newSlice, arr[i+startIndex])
		fmt.Printf("\tnewSlice: %v\n", newSlice)
	}
	fmt.Printf("= newSlice: %v\n", newSlice)
	return newSlice
}

which results in:

starting: arr: [10 15 20 25 35 45 50]
newSlice: []
	newSlice: [20]
	newSlice: [20 25]
	newSlice: [20 25 35]
= newSlice: [20 25 35]
GetSubselection returned: [20 25 35]
expected: 20, 25, 35

Trimming this down further, I found I could use the slice syntax (assuming a consecutive range of values) such as (Goplay Link To Run):

func GetSubselection(arr []int, startIndex int, itemsToSelect int) (newSlice []int) {
	fmt.Printf("newSlice: %v\n", newSlice)
	newSlice = arr[startIndex:(startIndex + itemsToSelect)]
	fmt.Printf("\tnewSlice: %v\n", newSlice)
	fmt.Printf("= newSlice: %v\n", newSlice)
	return newSlice
}

Range

The range expression gives you both the index and value, and it works for maps and structs as well. Turns out you can also work with a subselection of a slice in the range expression.

package main

import "fmt"

func main() {
	startIndex := 2
	itemsToSelect := 3
	arr := []int{10, 15, 20, 25, 35, 45, 50}
	fmt.Printf("starting: arr: %v\n", arr)
	fmt.Printf("Use range to iterate through arr[%d:(%d + %d)]\n", startIndex, startIndex, itemsToSelect)
	for i, v := range arr[startIndex:(startIndex + itemsToSelect)] {
		fmt.Printf("\ti: %d v: %d\n", i, v)
	}
	fmt.Print("expected: 20, 25, 35\n")
}

Slices

While the language is simple, understanding some behaviors with slices caught me off-guard. First, I needed to clarify my language.
Since I was looking to have a subset of an array, slices were the correct choice. For a fixed set with no changes, a standard array would be used. A Tour of Go says it well:

An array has a fixed size. A slice, on the other hand, is a dynamically-sized, flexible view into the elements of an array. In practice, slices are much more common than arrays.

For instance, I tried to think of what I would do to scale performance on a larger array, so I used a pointer to my int array. However, I was using a slice, which means that using a pointer wasn’t necessary. This is because whenever I pass a slice, it already acts like a pass by reference – the slice header points at the same backing array – unlike many of the other types.

newCollection := GetSubSelection(&arr, 2, 3)

func GetSubSelection(arr *[]int) {
...

I think some of these behaviors aren’t quite intuitive to a new Gopher, but writing them out helped clarify the behavior a little more.

Resources

This is a bit of a rambling about what I learned, written down so I could solidify some of these discoveries. #learninpublic

For some great examples, look at:

A Tour Of Go - Slices
Go By Example
Prettyslice GitHub Repo

If you have any insights, feel free to drop a comment here (it’s just a GitHub powered comment system, no new account required).

#powershell #tech #golang #development

The post Experiments With Go Arrays and Slices appeared first on SQLServerCentral.
Hope! from Blog Posts - SQLServerCentral

Anonymous
31 Dec 2020
2 min read
2020 was a rough year. We’ve had friends and family leave us. Jobs lost. Health scares aplenty, and that’s without counting a global pandemic. The end of PASS. US politics has been … nail-biting, to say the very least. All around, it’s just been a tough year.

On the other hand, I’m still alive, and if you are reading this, so are you. There are vaccines becoming available for Covid, and it looks like the US government may not try to kill us all off in 2021. Several people I know have had babies! I’ve lost over 50 lbs! (Although I absolutely do not recommend my methods.) Microsoft is showing its usual support for the SQL Server community, and the community itself is rallying together and doing everything it can to salvage resources from PASS. And we are still, and always, a community that thrives on supporting each other.

2020 was a difficult year. But there is always that most valuable thing: Hope.

A singer/songwriter I follow on YouTube did a 2020 year-in-review song. It’s worth watching just for her amazing talent and beautiful voice, but at about 4:30 she makes a statement that really resonated with me.

There’s life in between the headlines and fear.
The little victories made this year.
No matter what happens we keep doing good.
- Is that all we have?
Yes and we always should!
There’s nothing you can’t overcome.

https://www.youtube.com/watch?v=z9xwXJvXBIw

So for this new year I wish all of you that most precious of gifts. Hope.

The post Hope! appeared first on SQLServerCentral.

Time to Set Service Dependencies for SQL Server, it’s Easy from Blog Posts - SQLServerCentral

Anonymous
29 Dec 2020
6 min read
In the previous article I mentioned the need for setting certain dependencies for SQL Services startup. As promised, this article is a follow-up to that article to help you easily set the service dependencies for SQL Server. Setting service startup dependencies is an essential step to take to help ensure a seamless startup experience and to reduce the chance for failure. Some of the possible failures that could occur were explained in the previous article as well as in this article about MSAs by Wayne Sheffield. Our goal as data professionals is to minimize the chance for surprises and unnecessary time spent troubleshooting problems that shouldn’t have happened in the first place. Set Service Dependencies What is it that a service dependency does for the system? Well, a service dependency is much like any sort of dependency. A service dependency simply means that in order for a service to function properly another service needs to be functioning properly. This is very much like having children. The children are called dependents because children require somebody else to be around to take care of and support them to a certain point. A service that has a dependency means that it basically is a child service that needs a parent service to be properly functioning so the child service can go on about its duties and do what is expected / desired of it. So what are the service dependencies that we should be setting? The services that should be running in order to ensure SQL Server will work properly are Netlogon, W32Time, and KEYISO. For the SQL Agent service, the same services can be set as dependencies but you really only need to ensure that the SQL Server service is listed as a service dependency. Here is an example of what that would look like from the service properties pages in the services control panel. 
Now, you can either laboriously enter each of those dependencies while editing the registry (ok, so it isn’t really that laborious to do it by hand via regedit, but that does more easily permit unwanted errors to occur) or you can take advantage of something that is repeatable and easier to run. A script comes to mind as an appropriate method for the latter option.

Script it once!

Scripts are awesome resources to make our lives easier. This script is one that I use time and again to quickly set all of these service dependencies. In addition, it can also set the properties for your MSA account. One thing it does not do is set the service to “Automatic (Delayed Start)” instead of the default “Automatic” start type. That sounds like a fantastic opportunity for you to provide feedback on how you would add that to the script. Without further ado, here is the script to help save time and set your service dependencies easily.

#Todo - modify so can be run against group of servers
#     - modify so can be run remotely
$servicein = '' #'MSSQL$DIXNEUFLATIN1' #use single quotes in event service name has a $ like sql named instances
$svcaccntname = '' #'svcmg_saecrm01$' #to set managed service account properties
$RequiredServices = @('W32Time','Netlogon','KEYISO');

IF ($servicein) {
    $ServiceList = [ordered]@{ Name = $servicein }
}

IF ($svcaccntname) {
    $ServiceList = Get-WmiObject Win32_Service |
        Select-Object Name, StartName, DisplayName |
        Where-Object { ($_.Name -match 'MSSQL' -or $_.Name -match 'Agent' -or $_.Name -match 'ReportServer') `
            -and $_.DisplayName -match 'SQL SERVER' `
            -or $_.StartName -like "*$svcaccntname*" }
} ELSE {
    $ServiceList = Get-WmiObject Win32_Service |
        Select-Object Name, StartName, DisplayName |
        Where-Object { ($_.Name -match 'MSSQL' -or $_.Name -match 'Agent' -or $_.Name -match 'ReportServer') `
            -and $_.DisplayName -match 'SQL SERVER' }
}

foreach ($service in $ServiceList) {
    $servicename = $service.Name

    $CurrentReqServices = @(Get-Service -Name $servicename -RequiredServices | Select-Object Name);

    if ($CurrentReqServices) {
        $CurrentReqServices | Get-Member -MemberType NoteProperty | ForEach-Object {
            $ReqName = $_.Name;
            $ReqValue = $CurrentReqServices."$($_.Name)"
        }
        "Current Dependencies = $($ReqValue)";
    } ELSE {
        "Current Dependencies Do NOT exist!";
        $ReqValue = $RequiredServices
    }

    $CurrentServices = $RequiredServices + $ReqValue | Select-Object -Unique;

    $dependencies = Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name DependOnService -ErrorAction SilentlyContinue

    if ($servicename -match 'MSSQL') {
        if ($dependencies) {
            Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name DependOnService -Value $CurrentServices
        } ELSE {
            New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name DependOnService -PropertyType MultiString -Value $CurrentServices
        }
    }

    IF ($svcaccntname) {
        $mgdservice = Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name ServiceAccountManaged -ErrorAction SilentlyContinue
        if ($mgdservice) {
            Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name ServiceAccountManaged -Value @("01","00","00","00")
        } ELSE {
            New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\$servicename" -Name ServiceAccountManaged -PropertyType BINARY -Value @("01","00","00","00")
        }
    }
}

Mandatory disclaimer: Do not run code you find on the internet in your production environment without testing it first. Do not use this code if your vision becomes blurred. Seek medical attention if this code runs longer than four hours.
Common side effects include, but are not limited to: Diarrhea, Infertility, Dizziness, Shortness of breath, Impotence, Drowsiness, Fatigue, Heart issues (palpitations, irregular heartbeats), Hives, Nausea and vomiting, Rash, Imposter Syndrome, FOMO, and seasonal Depression. Script creator and site owner take no responsibility or liability for scripts executed.

Put a bow on it

DBAs frequently have tasks that must be done in a repeatable fashion. One of those repeatable tasks should be the task of ensuring the Service Dependencies are properly set. This article shares a script that creates a repeatable routine and takes some of that weight off the shoulders of the DBA. The script provided in this article is an easy means to help ensure consistency and repeatability in tasks that may have to be repeated many times. Doing these tasks with a script is mundane and monotonous enough. Imagine doing it by hand, manually, on hundreds of servers – or even just two servers. Then try to do it again in 6 months on another server – after you have forgotten what you did manually the first two times.

Interested in a little more about security? Check these out! Want to learn more about your indexes? Try this index maintenance article or this index size article.

This is the fourth article in the 2020 “12 Days of Christmas” series. For the full list of articles, please visit this page.

The post Time to Set Service Dependencies for SQL Server, it’s Easy first appeared on SQL RNNR. Related Posts: Here is an Easy Fix for SQL Service Startup Issues… December 28, 2020 CRM Data Source Connection Error January 23, 2020 SHUTDOWN SQL Server December 3, 2018 Single User Mode - Back to Basics May 31, 2018 Changing Default Logs Directory - Back to Basics January 4, 2018 The post Time to Set Service Dependencies for SQL Server, it’s Easy appeared first on SQLServerCentral.
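One small addition to the dependency-setting workflow described above: after the registry change, the result can be sanity-checked without opening regedit, since Get-Service exposes the same dependency list that the DependOnService value drives. A sketch (the service name is an assumption — substitute your instance's):

```powershell
# Lists the services that MSSQLSERVER depends on (its DependOnService entries):
Get-Service -Name 'MSSQLSERVER' -RequiredServices |
    Select-Object Name, Status
```

Seeing Netlogon, W32Time, and KeyIso in the output confirms the dependencies took; a reboot remains the real end-to-end test.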

2020 was certainly a year on the calendar from Blog Posts - SQLServerCentral

Anonymous
30 Dec 2020
1 min read
According to my blog post schedule, this is the final post of the year. It’s nothing more than a coincidence, but making it through the worst year in living memory could also be considered a sign. While it’s true that calendars are arbitrary, Western tradition says this is the end of one more cycle, so let’s… Continue reading: 2020 was certainly a year on the calendar. The post 2020 was certainly a year on the calendar appeared first on Born SQL. The post 2020 was certainly a year on the calendar appeared first on SQLServerCentral.
Requesting an Update for My SQLSaturday.com Bid from Blog Posts - SQLServerCentral

Anonymous
29 Dec 2020
1 min read
Someone asked about the bid, and I have had no response, so I sent this. The post Requesting an Update for My SQLSaturday.com Bid appeared first on SQLServerCentral.

Daily Coping 30 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
30 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.

Today’s tip is to bring joy to others. Share something which made you laugh.

I love comedy, and that is one outing that my wife and I miss. We’ve watched some specials and comedians online, but it’s not the same. I look forward to being able to go back to a live comedy show. One thing that I love about the Internet is the incredible creativity of so many people. While I get that there are a lot of things posted that others may not like, and it’s easy to waste lots of time, it’s also nice to take a brief break and be entertained by something. There is plenty to be offended by, but one of the cleaner, more entertaining things was brought to me by my daughter. She showed me You Suck at Cooking one night while I was cooking. I finished and then ended up spending about 20 minutes watching with her while we ate. Two I enjoyed: kale chips, potato latkes. The post Daily Coping 30 Dec 2020 appeared first on SQLServerCentral.