
Tech News

3711 Articles

SQLpassion Black Friday Deal 2020 from Blog Posts - SQLServerCentral

Anonymous
16 Nov 2020
1 min read
(Be sure to check out the FREE SQLpassion Performance Tuning Training Plan – you get a weekly email packed with all the essential knowledge you need to know about performance tuning on SQL Server.) As we all know, Black Friday is approaching quite fast, and therefore I want to offer you a great deal from my side. During the following 2 weeks I will offer my available Online Trainings at a 60% (!) discounted price:
- Design, Deploy, and Optimize SQL Server on VMware
- SQL Server on VMware – Best Practices
- SQL Server on Linux, Docker, and Kubernetes
- SQL Server Query Tuning Strategies
- SQL Server In-Memory Technologies
- SQL Server Performance Troubleshooting
- SQL Server Availability Groups
- SQL Server Extended Events
So, hurry and sign up for one (or even more) of my Online Trainings! Thanks for your time, -Klaus The post SQLpassion Black Friday Deal 2020 appeared first on SQLServerCentral.


What the future of PASS needs – and how I can help from Blog Posts - SQLServerCentral

Anonymous
16 Nov 2020
5 min read
At the recent PASS Virtual Summit, I and the other candidates took part in an Ask Me Anything (AMA) session, where the community posed us questions. It was a great experience for me – of the 3 days it ran, I had time to participate in 2 of them. Current board activities and client requirements didn't allow me to take part in the final session. It was on reflection of the AMA session that I thought some key things need to be looked at. Namely:
- how the community can be engaged in the future of PASS
- how we can align as a global team
- what the event(s) in 2021 might look like

Community Engagement
The Board needs to look to the leaders within our community. We need their help – because to go forward we all need to be on the same page. We need to work cohesively to save the community platform that is PASS. This cannot be a top-down approach. We need your help. I have worked in many situations where people have said "it can't be done" or people are siloed in their approach. One of the reasons I got into DevOps was that I realised early on that we needed a goal that could align disparate groups of people. It worked wonders in my corporate career and I've excelled in it whilst running a consultancy company. In short – we need to listen to our community, we need to work with that community, take the positive actions, and measure the not-so-positive actions. Measure them against the goal of PASS – a platform that helps the community learn, grow and connect.

Alignment
The alignment of a global team is related to the above action item. We need to work actively with our Regional Mentors – because they represent our regions. They should be actively working with their community leaders and actively telling us, the Board, what works and what doesn't. We need this alignment, we need this communication, and we need people who are passionate about making a difference for their community. I appreciate these are volunteer positions and that they can take a bit of personal time – however, time can be allocated if something is important enough, and I have found helping a community to be somewhere people feel safe, can learn and become better people to be worth it. I was recognised by the community in 2019 for being relentless when it came to helping others: https://www.pass.org/About-PASS/PASS-News/2019-passion-award-winner In my life I take a pragmatic approach to disparity – one of my early mantras was "make stuff go", and when placed in situations where groups of people wouldn't agree on action items, I'd measure our opinions on whether they made stuff go or not. It is a very simple thing to do; the hard part is actually listening and distilling down to the fundamental problems at hand. Not up-talking or complicating things for our own ego. That is what is needed for both items 1 & 2 above. I can help here; I have a proven corporate record of doing that.

Events in 2021
Basically, this is about running out of $$. We need to take a pragmatic approach to the survival of PASS – and events may be smaller and more regional in 2021, purely because of the covid situation we are experiencing right now. In saying that, the Virtual Summit that we've just had was better than I personally thought it would be. I'm interested to see what the numbers/percentages look like for attendee experience. Certainly compared to other events I've been on it was better; yes, there are improvements to be made. There were aspects that I didn't like – but we have a voice, and constructive feedback is far better than just saying something is bad.

It will be interesting to see what the balance sheet looks like when all things are done – I feel that some hard decisions have to be made to ensure the longevity of the platform known as PASS. Some of those decisions will involve how we engage with the community and how we manage events, as well as what strategic decisions we can make around content delivery. Not easy at all, but my passion for our community means I'm ready to stand up and fight for what PASS really needs to be. PASS has changed my life – dramatically. I used to be a mediocre DBA. I thought locally rather than globally, and the personal growth I have experienced by meeting passionate mentors has made me a far better person. It has resulted in my own mentoring of people and also in a wider sense of community – not just a data platform community, but helping others: https://techcommunity.microsoft.com/t5/humans-of-it-blog/guest-blog-how-mvps-can-be-champions-for-diversity-and-inclusion/ba-p/824013 This is why I want PASS to continue – things have to change and I want to be a positive participant in that journey. Please vote here: https://pass.simplyvoting.com/?user=pass&language=en&og=112896 My other blog post around why I am running is here: Why I am running for the PASS Board in 2020. #Yip. The post What the future of PASS needs – and how I can help appeared first on SQLServerCentral.


PASS Summit 2020 – My experience from Blog Posts - SQLServerCentral

Anonymous
15 Nov 2020
8 min read
This was the first year in 16 years for me that there has been a fall season without the PASS Summit to go to. Every year during the fall, it has been a soul-sustaining practice to pack up and head to the summit (typically, Seattle) and spend a week with friends – learning, sharing one another's stories and just enjoying the warmth and comfort of being with people who care. This year, thanks to Covid, that was not to be. We had a virtual summit instead, and most of us were skeptical about how it was going to work out. For me personally, this year has been THE most challenging of my adult life. Health issues, losses in my immediate family due to covid, and just the emotional stress of the hermit-like existence with no relief in sight were beginning to get to me. The virtual summit came as a welcome relief. Below is my experience.

PRECONS: I signed up for two precons – Day 1 on PowerShell with Rob Sewell, and Day 2 on Execution Plan Deep Dive with Hugo Kornelis. Both were excellent in terms of quality and rendering. BUT, I lost 3 hours of Rob's precon because he was in a different timezone (I knew this when I signed up but thought the recordings would make up for it), and Hugo's was packed with so much content that a rewatch (or multiple rewatches) would have helped. But the recording was limited to the Thursday of the summit. I am not sure how anyone thought people attending the summit would find time to rewatch anything in this painfully short interval. I certainly could not, and my learning was limited to the first attendance. I will definitely reconsider putting $$ down on a precon if this is the way it continues to be done.

SUMMIT DAY 1: It felt unusual/odd to start a day without a keynote, but I got used to it and attended two great classes in the morning – Raoul Illayas on 'Data Modernization, how to run a successful modernization project', followed by David Klee's class on '10 Cloudy questions to ask before migrating your sql server'. Both classes were excellent, with great Q&A and well moderated. This was followed by the afternoon keynote. It started with my friend and co-chapter lead Tracy Boggiano winning the Passion Award – which was a heart-warming moment. The Passion Award made a huge, positive impact on my life and I was a bit sad she did not get to experience it in person. But she did say a few words and it was received really well by everyone in the community. This was followed by several Microsofties demoing cool product features – led by Rohan Kumar. It was fun and interesting. I attended two sessions in the afternoon: 'What is new in sql server tools' by Vicky Harp and her team at Microsoft, followed by 'Making a real cloud transformation, not just a migration' by Greg Low. Both were outstanding classes. In the evening I had to moderate a Birds-of-a-Feather bubble on Mentoring – I had a good chat with a few friends who showed up, and made a couple of new friends as well. Overall it was a worthy day of learning and networking, with few real glitches to worry about.

SUMMIT DAY 2: I started this day by attempting to reach out to a few friends – these are people I see in person and not on social media. I sent them a message via the messaging option, but did not hear back. I was disappointed with how this worked. I was able to catch up with a few friends accidentally – because they were in the same class or same network bubble – but intentionally reaching out was really hard and did not seem to work very well. I also visited a few vendor rooms online – vendors were the reason the virtual summit was possible. Vendor visits are always a big part of my in-person summit attendance, so I wanted to make sure they were thanked. I got good responses for my visits at Red Gate and Sentry One. Some of the other vendors did not care to respond very much (maybe they did not have anyone online). I also attended two classes: 'Normalization beyond third normal form' with Hugo Kornelis and 'Azure SQL: Path to an Intelligent Database' with Joe Sack and Pedro Lopez. Both classes were outstanding in content. The afternoon's keynote was again from Microsoft – it seemed to have content but was a bit dry and difficult to follow along. I think, given how difficult this year was in general, we can forgive Microsoft this. The absolute highlight of Day 2, to me, was the Diversity in Data panel discussion that I was part of – with some amazing women – Tracy Boggiano, Hope Foley, Anna Hoffman, Jess Pomfret and DeNisha Malone. I have been on a few panels, but this one was truly well moderated by Rebecca Ferguson from PASS HQ and well attended by a number of people in the virtual summit, including several of my own colleagues. It was a true honor to do it.

SUMMIT DAY 3: The last day arrived, albeit too soon. I logged in early and attended a few sessions ('Execution plans, where do I start' by Hugo Kornelis, 'Getting Started with Powershell as a DBA' by Ben Miller). The 'Diversity, Equity and Inclusion' keynote by Bari Miller was amazing – I am planning to revisit it and make more notes, probably a separate blog post. In the afternoon I attended 'Splitting up your work in chunks' by Erland Sommarskog, and 'the good, bad and ugly of migrating sql server to public clouds' by Allan Hirt. Halfway through Allan's class (which was outstanding as always) I started to feel really tired/brain-fried and wanted a break. There was a networking session open for late evening – so I hopped on it and had a lovely chat with many friends I could not see in person. That ended a very fun week. Following are what worked well and what didn't:

POSITIVES
1. The platform was relatively easy to navigate – finding a class was really easy.
2. Chat rooms in classes were a lot of fun and good places to find friends unexpectedly as well. All the sessions I attended personally, except a few on Friday, were very well moderated.
3. PASS HQ and BoD members were freely available, and it was really easy to find and have chats with them throughout the week if one desired.
4. Replaying sessions was a really good bonus treat – I wish the same were true for precons as well.
5. Having the recordings available on an immediate basis is absolutely great – I am watching a few this weekend and able to catch what I missed in class. This would be very hard to do if we had to wait a couple of months for the recordings.

NEGATIVES
1. I am not sure how any vendor would make any gains out of this. Traffic in vendor rooms, when I visited, seemed low, and in some cases vendor reps did not even bother answering chat messages. (I don't blame them if the traffic was not much.)
2. Transcribing/subtitling was a total mess. Granted, it made for a lot of very fun moments in many classes, but the purpose of it is to help the hearing impaired; I don't think it lived up to that cause at all.
3. Precons – especially the ones given by experts – are, to me, not worth it with the replay option gone in such a short time. I would have appreciated having the weekend to replay both my precons, but I didn't.
4. Finding individuals to chat with was insanely hard. This is especially true of people not on social media. I was very disappointed that I never heard back from many people to whom I sent messages via the platform. I don't think they knew where to check this; I certainly didn't. I had to largely depend on people being on social media or showing up in group chats (many did).

Overall, it was a worthy experience to me, and especially uplifting in a time like this. I am hoping that next year PASS could do a blend of live and online classes so we can all make the best of both worlds. Thanks to everyone for attending and supporting the organization. The post PASS Summit 2020 – My experience appeared first on SQLServerCentral.


Notes on the 2020 PASS Virtual Summit – Part 6 from Blog Posts - SQLServerCentral

Anonymous
14 Nov 2020
4 min read
Mostly Friday notes here. This morning I was the moderator for Continuous Integration with Local Agents and Azure DevOps by Steve Jones. The way the Summit worked this year is that the speaker, moderator, and support staff would all join a Zoom call, and then the Zoom call was streamed back to Cadmium and out to users. I also had a private link to the streamed session so that I could monitor the questions and discussions. I started out by joining Zoom and the public stream, as I wanted to see the delay first-hand. It seemed to be around 30 seconds, but I understand it can vary up to 90 seconds; I only checked a couple of times. In that mode I could see the questions but not answer them. It was also – as expected – a confusing experience, because on Zoom Steve was 30+ seconds ahead of what was playing in the public stream. So I closed that, joined back with the private link, and muted the audio. After that it was watching the question tab while listening to the presentation on Zoom. The technical-experience questions were handled by the support team, so I only had to watch for the session-content ones. We ended up with two batches, one about halfway through and one toward the end. The first question was a good one, but I had not seen the deck ahead of time and decided to hold it for a slide to see if it would get answered in the flow. Almost – so then it was just a matter of waiting for a place to interrupt, without interrupting! I tried to do the following, with mixed success:
- Pose the question as "Andy asked…" if I thought I could get the name right.
- Mute.
- Try to capture a bit of the verbal answer and type that into the question answer. In one case I was able to find a link to the page he was looking at.
- Mark as answered.
- Go back and mark any technical question as a favorite (if there had been 100 questions, maybe this wouldn't have been doable).
- Repeat, keeping an eye on the time.
He ended up with a couple of minutes to spare at the end after answering all of the questions, so I asked a couple of my own to clarify a couple of points, and that carried it just a bit over the end time (which is ok). I checked the discussion tab a few times, but short of a big problem I assumed that was attendees chatting and shouldn't need moderation. The UI for this was just OK. One tab for questions, with an option to see all or only unanswered, and then the discussion tab. You can go back and forth or set them up side by side. When you answer a question there is no option to mark it as a favorite right then; you have to go through the all-questions list to mark it. Annoying, not huge. I tried Discord, didn't seem to land in the right place, then got interrupted and didn't get back to it. Someone out there, write something about that experience! A friend mentioned that she wished she had thought to write and publish notes from the beginning, and I'm noting that here because we should encourage that. Finding topics to write about is hard, and the Summit offers a lot of topics. More than that though, it's a way to think about and share what works, what's fun, what you noticed or appreciated that maybe no one else did (and it's a way to share with those not able to attend). I'd love to see some kind of contest next year related to blogging. I finished up the day in the community zone, pleased to see old friends and have a chance to talk about this and that.
A few miscellaneous notes based on some stuff we talked about:
- Classroom layouts in general are not great for networking.
- Background music and getting everyone a drink (even if it's soda) tend to encourage walk-and-talk networking (like the opening night reception).
- Holiday events are good (if hard to do this year).
- There is lots of value in having events that are just Q&A, or just a meal together.
That finishes up the week. This weekend I'm going to read back over my notes and think about the week before writing a final wrap-up post for Monday. The post Notes on the 2020 PASS Virtual Summit – Part 6 appeared first on SQLServerCentral.


Daily Coping 13 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
13 Nov 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. Today’s tip is to be creative. Cook, draw, write, paint, make, or inspire. I like creativity. It’s a part of my job and my life, as I often need to think about how to write. I’m also a little creative in my life, as I cook most of the meals, fix things, play guitar, and sometimes actually build something interesting. This week, I’ve started going back through my guitar courses and practicing. Today, to relax during the PASS Virtual Summit, I’ve kept a guitar handy and just mindlessly played some riffs over and over as I watched some talks. I’m also hoping to get “Lovely Day” good enough that I can play it tonight for my wife. She loves that song. The post Daily Coping 13 Nov 2020 appeared first on SQLServerCentral.


Notes on the 2020 PASS Virtual Summit – Part 5 from Blog Posts - SQLServerCentral

Anonymous
13 Nov 2020
4 min read
Thursday evening, writing a few more notes. It felt like I struggled more to stay engaged and attentive today, which is about average for day 4 of a 5-day event, for me at least. About 3 pm I went for a long walk and listened to a presentation; that worked out well, as it was stuff I mostly knew and didn't need to see the slides for. I used the phone app for that and it worked fine. Some general feedback about presenting: I think it's easy to spend too much time up front on the 'me' slide, the 'PASS' slide, and whatever else. I'd be interested to see what everyone thinks, but for me that's a max of about 2 minutes. The question tab for the presentation is just ok. I saw at least one session where the answer was "answered", presumably entered sometime during the presentation. That's not very helpful to attendees (and moderators can mark questions as answered to help sort through the list). Better would be a quick sentence or two to summarize the answer; really superb might be that and/or a time mark in the presentation ("go to 2 mins 15 to see more on the answer to this question", that kind of thing). I haven't gone back to look, but for sessions that were pre-recorded in particular, I wonder if the final recording will somehow contain the questions. The phone app has an 'activity feed' section, though on my phone it looks more like 'Activity fee'. Mildly funny. It doesn't seem to get much use. Not being able to set a password really annoys me now. The phone app logged me out for whatever reason, so I had to go look up the password in the email they sent. There's a session evaluation button in each "room", and that's good, but I consistently forget to do it while I'm there and then have to go back and fill them out later (which means I do fewer). Some samples below; the eval screenshot is about 50% of the total questions you can answer. Tim Ford talked a bit about financials at the start of the lunch keynote today and it was ok. Most of the key parts were there, but it would be easy to not read entirely between the lines. I appreciate trying to remain positive and highlight moves and decisions that have been made, but I'd have asked and hoped for something far more direct. I don't see that video up as public anywhere yet. I went to office hours for the Board around 4 pm; it's the same video chat used in the networking bubbles. Grant & Lori were there, and there was some casual conversation and some good questions. I like it as a less formal and less stressful way for the Board to interact, but I do like and miss the whole Board being in front of an audience taking questions at the same time. For those who read my post from yesterday about the state of PASS financials, I'd say there is no good news. Summit registrations and PASS Pro signups are both well under the goals in the budget. They should have a final accounting within a couple of weeks; I'll leave it to the Board to share the exact numbers. I didn't do birds of a feather or really interact much with anyone today. Part introvert, part that it's just not the same as seeing people and pausing for a few minutes of conversation. I hear that Jen and Sean have a Discord channel set up; maybe tomorrow I'll try to find that. As far as content, there has been a good variety on the schedule and no major technical issues that I've seen. The networking piece is just harder to do well, and to be fair I don't have a vision for how to do it better.
Maybe one suggestion is that attendees could set up their own times to chat and have them show up on the schedule – "Chat with Andy at 4 pm" or whatever. Tomorrow is the last day, and the only new thing I have planned to try is moderating a session for Steve at 10:30, I think. The post Notes on the 2020 PASS Virtual Summit – Part 5 appeared first on SQLServerCentral.

Notes on the 2020 PASS Virtual Summit – Part 4 from Blog Posts - SQLServerCentral

Anonymous
13 Nov 2020
2 min read
Last night (Tuesday) was the main reception and then "networking bubbles". The reception consisted of DJ Leanne playing assorted music (and seeming to have a good time doing so) with a chat window for attendees, and a bartender/mixer (mixologist?) making a few variations of a SQL Sipper. Reading that, it probably feels underwhelming, and I admit that was my first impression too, but then the chat window just started scrolling. Someone called it 'speed Twitter' and that's close to it – people saying "hi" and not much in the way of long conversations – but for me (not a Summit first-timer) it somehow absolutely captured what it's like in person to walk into a room with music and a hundred or more people you know. Part two was the networking bubbles, maybe 20 different music-themed video chat rooms. I tried a couple, some had the music and some didn't, and said hello to a few people. Not a great browsing experience: you had to exit a room, click on whichever one you wanted to go to next, then launch the browser again. I didn't see a fast way to just go from room to room to room. It also seems like most rooms had 3 or 4 people; it was hard to find out where (or if) the crowd was. Not terrible, but not quite compelling either, for me at least. Today was more like a usual Summit day, attending sessions and getting interrupted for work stuff a couple of times. It was nice to be able to pause a session (even the live ones). The recordings of live sessions are not available the same day; there are some from today I would have liked to watch tonight. The keynote was at lunchtime and went fairly well. Rohan talked about being in a studio versus on stage and, in what was a nice touch, showed the studio with the cameras and the teleprompter. The rest was kind of the standard keynote; maybe most interesting was that sometime in the future Query Store will gain the ability to add query hints. Not much exploring today. It does seem like the phone app is slightly different in a few places, so I need to look at that more. The post Notes on the 2020 PASS Virtual Summit – Part 4 appeared first on SQLServerCentral.


Daily Coping 12 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
12 Nov 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to give yourself a boost. Try a new way of being physically active. My weekly activities are usually yoga, weight lifting, and swimming. I may use the rowing machine or bike, but those are less frequent. Horseback riding is rare, but skiing is usually something I start thinking about this time of year as a 1–2x/week activity. In trying something new, I decided to add in some walking. I stopped running a few years ago, but with being sick this fall, I need something that works me but isn't too taxing – something I can also include the dogs in, so walking it is. I'm adding in a walk a week, or hoping to, and trying to get some different types of movement, especially as I hope to travel somewhere in 2021 and do more walking/hiking. The post Daily Coping 12 Nov 2020 appeared first on SQLServerCentral.


A quick and dirty scan of a list of instances using a dynamic linked server. from Blog Posts - SQLServerCentral

Anonymous
12 Nov 2020
5 min read
Note: This is not exactly a dynamic linked server. It just gets dropped and recreated in a loop. I recently did a post on testing a linked server where I said I would explain why I wanted to make the test. Basically, I needed to scan a few hundred instance names and do the following:
- Check if the instance is one we have access to, or even exists. If not, make a note of the error so we can tell the difference.
- Collect information like instance size (total size of all databases), CPU count, memory count, etc.
- Collect a list of database names on the instance, their status, size, etc.
So the first thing I did was throw the list of instance names into a cursor and then put the code from my last post inside the loop.

-- Create MyLinkedServer using the current server so that it exists and
-- the code will compile.
EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer', @srvproduct = N'',
    @provider = N'SQLNCLI', @Datasrc = @@SERVERNAME;
GO
-- Create temp table to hold instance names.
-- You'll probably want a permanent table.
CREATE TABLE #InstanceList (InstanceName NVARCHAR(256));
INSERT INTO #InstanceList VALUES ('InstanceA'), ('InstanceB'), ('InstanceC');

-- Declare vars
DECLARE @sql NVARCHAR(MAX);
DECLARE @InstanceName NVARCHAR(256);

-- Setup cursor to loop through servers
DECLARE InstList CURSOR FOR SELECT InstanceName FROM #InstanceList;
OPEN InstList;
FETCH NEXT FROM InstList INTO @InstanceName;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        -- Drop and recreate the linked server so the loop works past the first instance.
        IF EXISTS (SELECT * FROM sys.servers WHERE name = 'MyLinkedServer')
            EXEC master.dbo.sp_dropserver @server = N'MyLinkedServer', @droplogins = 'droplogins';
        EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer', @srvproduct = N'',
            @provider = N'SQLNCLI', @Datasrc = @InstanceName;
        -- Test the linked server.
        EXEC sp_testlinkedserver @server = N'MyLinkedServer';
        EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname = N'MyLinkedServer', @useself = N'True',
            @locallogin = NULL, @rmtuser = NULL, @rmtpassword = NULL;
    END TRY
    BEGIN CATCH
        INSERT INTO dbo.[LinkedServerLog]
        VALUES (@InstanceName, ERROR_NUMBER(), ERROR_SEVERITY(), ERROR_STATE(),
                ERROR_PROCEDURE(), ERROR_LINE(), ERROR_MESSAGE());
    END CATCH
    FETCH NEXT FROM InstList INTO @InstanceName;
END
CLOSE InstList;
DEALLOCATE InstList;

-- Cleanup
IF EXISTS (SELECT * FROM sys.servers WHERE name = 'MyLinkedServer')
    EXEC master.dbo.sp_dropserver @server = N'MyLinkedServer', @droplogins = 'droplogins';

I now have a piece of code that loops through a list of instances and creates a linked server for each one. It then tests that linked server to make sure I can connect, and if I can't, stores the error in an error table. From there I can see which instances I couldn't connect to and which I could connect to but couldn't log into. Now all I have to do is add code into the TRY block that uses the linked server to collect information.

-- Create MyLinkedServer using the current server so that it exists and
-- the code will compile.
EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer', @srvproduct = N'',
    @provider = N'SQLNCLI', @Datasrc = @@SERVERNAME;
GO
-- Create temp table to hold instance names.
-- You'll probably want a permanent table.
CREATE TABLE #InstanceList (InstanceName NVARCHAR(256));
INSERT INTO #InstanceList VALUES ('InstanceA'), ('InstanceB'), ('InstanceC');

-- Create temp table to hold database sizes.
CREATE TABLE #DBList (
    InstanceName NVARCHAR(256),
    DatabaseName NVARCHAR(256),
    DatabaseSize DECIMAL(17,5)
);

-- Declare vars
DECLARE @sql NVARCHAR(MAX);
DECLARE @InstanceName NVARCHAR(256);

-- Setup cursor to loop through servers
DECLARE InstList CURSOR FOR SELECT InstanceName FROM #InstanceList;
OPEN InstList;
FETCH NEXT FROM InstList INTO @InstanceName;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        IF EXISTS (SELECT * FROM sys.servers WHERE name = 'MyLinkedServer')
            EXEC master.dbo.sp_dropserver @server = N'MyLinkedServer', @droplogins = 'droplogins';
        EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer', @srvproduct = N'',
            @provider = N'SQLNCLI', @Datasrc = @InstanceName;
        -- Test the linked server.
        EXEC sp_testlinkedserver @server = N'MyLinkedServer';
        EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname = N'MyLinkedServer', @useself = N'True',
            @locallogin = NULL, @rmtuser = NULL, @rmtpassword = NULL;
        -- Collect database names and sizes (8KB pages -> MB -> GB) over the linked server.
        INSERT INTO #DBList
        SELECT @InstanceName, dbs.name, SUM(size)/128/1024.0
        FROM MyLinkedServer.master.sys.databases dbs
        JOIN MyLinkedServer.master.sys.master_files dbfiles
            ON dbs.database_id = dbfiles.database_id
        GROUP BY dbs.name, dbs.database_id;
    END TRY
    BEGIN CATCH
        INSERT INTO dbo.[LinkedServerLog]
        VALUES (@InstanceName, ERROR_NUMBER(), ERROR_SEVERITY(), ERROR_STATE(),
                ERROR_PROCEDURE(), ERROR_LINE(), ERROR_MESSAGE());
    END CATCH
    FETCH NEXT FROM InstList INTO @InstanceName;
END
CLOSE InstList;
DEALLOCATE InstList;

-- Cleanup
IF EXISTS (SELECT * FROM sys.servers WHERE name = 'MyLinkedServer')
    EXEC master.dbo.sp_dropserver @server = N'MyLinkedServer', @droplogins = 'droplogins';

A couple of notes here. I have a piece of code at the top that adds MyLinkedServer to make sure it exists when the code starts; otherwise it won't compile. Also, because of the connection time for the test – particularly if you have a bunch of instances that you can't log into or that don't exist – this script is going to take a while just to handle the loop. Make sure that the data collection code is hitting system tables and/or is as quick as you can make it. Like I said in the title, this is pretty quick and dirty. This is the kind of thing you throw together because your manager wants some data collected from a bunch of sources ASAP and T-SQL is by far your best language. There are a lot of better ways to handle this – one is sketched below. The post A quick and dirty scan of a list of instances using a dynamic linked server. appeared first on SQLServerCentral.
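One of those better ways, as a minimal sketch assuming the dbatools PowerShell module is installed (the instance names and output path here are placeholders, not part of the original post):

Import-Module dbatools   # assumes: Install-Module dbatools

# Hypothetical list of instances to scan
$instances = 'InstanceA', 'InstanceB', 'InstanceC'

$results = foreach ($instance in $instances) {
    try {
        # One row per database: instance, name, status, size (SMO reports Size in MB)
        Get-DbaDatabase -SqlInstance $instance -EnableException |
            Select-Object SqlInstance, Name, Status, Size
    }
    catch {
        # Record the failure so unreachable instances can be told apart from login failures
        [pscustomobject]@{
            SqlInstance = $instance
            Name        = $null
            Status      = "ERROR: $($_.Exception.Message)"
            Size        = $null
        }
    }
}

$results | Export-Csv -Path .\InstanceScan.csv -NoTypeInformation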


Serious Thoughts About PASS from Blog Posts - SQLServerCentral

Anonymous
12 Nov 2020
4 min read
Back when Covid started, I guessed that this might be yet another live-or-die challenge for PASS. PASS is heavily dependent on Summit revenue and uses it to fund all the other stuff. Looking at the most recent budget notes (see the PASS blog), there's a forecasted shortfall of $1.5m – and that's if they hit their attendance targets for the Summit and the sign-up goal for PASS Pro. Maybe they can make that up from the more or less $1m in reserves, plus a bit of cash on hand, plus some more cuts. Maybe there will be an insurance claim for revenue loss that might supplement that, though such claims are typically very slow to process. Maybe attendance will exceed the goal and things will work out. If they hit the goals in the budget, that provides enough money to get through the fiscal year ending June 2021 (or maybe more – I'm not a finance guy, go look at the revised budget). If they come up short, then deeper cuts will be the least of it; in the worst case PASS might have to file for bankruptcy. I'm writing all of that based on some experience being on the Board quite a while ago, no insider knowledge of what's going on now, and reading what has been posted to the PASS blog/news and in the Board minutes. If things are better than I think, well, that would be welcome news. I suspect for many it's tempting to throw rocks. But now isn't the time for that. From the day Covid hit, I think the Board has tried hard to figure out what to do and when to do it. Deciding to go virtual was hard for a lot of reasons:
- They had 2020 venue contracts to think about and figure out
- It would mean a substantial cut to the budget
- They had never done it before, so it was a huge risk
You can argue about the decisions, the platform, the lack of transparency, the implementation, whatever else, but that's all water under the bridge now. We can post-mortem later. Soon after the Summit completes they should have a really good view of the budget, and at that point they can make some decisions, perhaps really tough ones. I've written all of the above to get you to think about the next part. The Board is dealing with all of this at a time when they are all under stress at home and at work. Covid has impacted everyone. More than that though, they fear failing. Wayne Snyder used to tell me something along the lines of "I don't want PASS to fail on my watch", and I think every Board member feels that obligation. Max stress, to be sure. The other part is our full-time staff. Yes, they technically work for C&C, but many of them have been with us for years, and all do the work we ask them to do. Imagine what it's like to see these events, know the challenges, and wonder if you'll have a job for much longer. Max stress, plus. Right now there isn't much you and I can do except recognize that there are a couple of dozen people living with a lot of stress and trying to get PASS back on a solid footing. Next year, after the immediate decisions have been made, we can and should talk about whether PASS is all we want it to be, but for now we have to wait and see. These are our people and they need whatever support and kind words we can muster. The post Serious Thoughts About PASS appeared first on SQLServerCentral.

Vote in the PASS Election from Blog Posts - SQLServerCentral

Anonymous
12 Nov 2020
1 min read
If you log in to pass.org AND have previously met the criteria to vote this year, you'll see this. Click the button and you'll be off to the voting page, where you can pick three candidates from the slate. I haven't endorsed anyone this year and I'm not going to announce my choices either. Read the applications, view the scores from the Nominating Committee, and if you can, read through the questions and answers on the Twitter AMA. The post Vote in the PASS Election appeared first on SQLServerCentral.


Configure SQL Server 2019 Multi-Subnet AlwaysOn Setup - Part 12 from Blog Posts - SQLServerCentral

Anonymous
12 Nov 2020
1 min read
The post Configure SQL Server 2019 Multi-Subnet AlwaysOn Setup - Part 12 appeared first on SQLServerCentral.


PASS Summit 2020 Precon – Scripts and Links from Blog Posts - SQLServerCentral

Anonymous
11 Nov 2020
5 min read
The PASS Virtual Summit 2020 conference is live this week, and I'm thrilled to have presented a preconference boot camp yesterday called "Amplify Your Virtual SQL Server Performance". During the precon, one of the attendees in my session asked a really good question: "Do you have a list of all the scripts and links that you reference in the presentation?" The answer was no yesterday, but I do now! The following list of scripts and links is designed to help you know more about the tools and techniques and how I use them to improve the diagnostics and testing side of performance tuning.

Microsoft DiskSpd
DiskSpd is a synthetic disk benchmarking utility. A spiritual successor to the SQLIO utility, I use it to simulate certain SQL Server-like I/O patterns on different storage subsystems so I can see how certain storage devices respond to various disk access patterns. The scripts used to generate a random workload for evaluation are as follows (a hedged reconstruction of the command lines appears at the end of this post):
- 80/20 read/write test, 100GB workload file, 4KB disk access, 4 worker threads and 8 operations per thread – to simulate activity without trying to overwhelm the storage
- 80/20 read/write test, 100GB workload file, 64KB disk access, 8 worker threads and 8 operations per thread – to simulate a high-demand I/O workload pattern

DVDStore Database Benchmarking Utility
HammerDB is not the only database benchmarking utility in town. The DVDStore benchmarking suite is useful for repetitive database benchmarking tests where I can script it out and run it via a PowerShell/batch file to make for very quick database activity. I have a walk-through for how to use this tool out at my company site here. The command that I used to simulate the moderate SQL Server load is as follows.

Tuning Cost Threshold of Parallelism
Jonathan Kehayias has a great query out at SQLskills that I use to help identify any and all commands in the execution plan cache that have either gone parallel or stayed single-threaded. With derivatives of this query, I'm able to find key queries with a query cost to replay and validate where the Cost Threshold of Parallelism setting should be dialed in for a given SQL Server workload.

DBATools PowerShell Module
It goes without saying how important the DBATools PowerShell module is to all DBAs around the world. I used this to help deploy and manage all of the SQL Servers that I demoed during my precon. If you've not yet explored it, I implore you to dig in and learn how you can use this tool to make your life better.

Perfmon Setup for 24×7 Collection
Windows Perfmon is a fantastic means to collect and record ongoing performance metrics from your key SQL Servers. I've got a PDF and video set here to show you how to set up Perfmon for ongoing 24×7 collection.

Import Perfmon Data into a SQL Server Database
In addition to collecting the data, we now need to load it into a database so that we can start to mine it and extract its meaning. We've released a PowerShell module at GitHub that will help you extract the BLG file and load the data into a SQL Server database, along with sample mining queries to start to access key metrics within the data. We also have a video here that will show you how to use it and extract meaning from the data.

CPU-Z
CPU-Z is a wonderful free tool to help you identify all the key components and specifications of the processors underneath your most critical servers.

Geekbench
Geekbench is my go-to choice for performing relative CPU performance comparisons. Its per-core benchmarking metrics help me understand the performance differences that you can expect when moving to new hardware, either on-premises or in the cloud. The best part is that the database of results is searchable, so you can mine the info to see the performance of other CPUs that you might be interested in moving to.

iperf
Iperf is a great free utility to help you perform network throughput testing. You download a copy, deploy it to multiple servers, and then set up one as a server and one as the test client. More details on how to use this utility are found here.

CoreInfo (Microsoft Sysinternals)
The CoreInfo utility from the Microsoft Sysinternals suite is a great utility to help you determine the in-guest CPU topology, as seen and used by your SQL Server workloads. NUMA, hyperthreading, and alignment tasks can be validated through the use of this free utility.

SQL Server Disk Stall Collector
SQL Server keeps track of the 'stall' rates of how quickly it can access the database data and log files on the file system. These are different metrics than what your underlying operating system tracks, as the disk stall rates can show indications of metadata contention inside SQL Server, even if the underlying operating system doesn't see it. DMVs inside SQL Server do track this information, and while aggregate information is usually what is stored, the law of averages might mean that you lose the importance of the data. This script, documented on my other site, mines this information and helps you track it over time so you can understand if you have hot spots during the day or night that are causing your databases to run slow.

I really hope that these scripts and utilities help you test, validate, and stress the infrastructure underneath your critical SQL Servers so that you can have the best possible platform for your data! Enjoy the rest of the PASS Summit conference, and do not hesitate to reach out if you have any additional questions on the content in my precon! The post PASS Summit 2020 Precon - Scripts and Links first appeared on Convergence of Data, Cloud, and Infrastructure. The post PASS Summit 2020 Precon – Scripts and Links appeared first on SQLServerCentral.

What data collaboration looks like and how to achieve it from What's New

Anonymous
11 Nov 2020
8 min read
Forbes BrandVoice | Tanna Solberg | November 11, 2020
Editor's note: This article originally appeared in Forbes.

Data is inexhaustible—there's more of it than we could ever imagine or use. But in important ways, it's just like conventional raw material in that its value is derived from what becomes of it—not the individual data points themselves. A data stockpile, for instance, is of little interest or use. It's why Matthew Miller, product management senior director at Tableau, urges people not to assume that every pretty data dashboard yields useful results. "As much as we love data, and we love insights, insight alone doesn't transform organizations," he said. "And no one is measured on how many dashboards they've looked at, or how many dashboards they produce, or how many petabytes are in their data warehouse. It's about driving organizational performance."

By putting people in the right circumstances—with the right tools and access to the right data—organizations can do amazing things. That was clear in the early months of the Covid-19 response, when organizations with people who could make informed decisions, despite uncertainty and rapidly changing circumstances, had an advantage. Even as organizations compressed years of digital transformation roadmaps into the span of just a few weeks or months, they were able to make choices that outmaneuvered competitors in the most crucial moments. Leaders can build on those lessons by laying the cultural foundation for productive and valuable collaboration around data. It's there, and not in data lakes or stylized visuals, that the real coin is minted.

To understand data collaboration, think about caller ID
In the beginning of wired phone service, phone companies kept meticulous databases of customers and phone numbers. They used this data internally to bill customers, route calls and provide value-added services, including phone books and operator-assisted number lookup. Monthly bills would typically list every number called over the span of a month in case that look-back analysis might be of use. Then caller ID came onto the scene. Caller ID didn't create any new data. It simply presented existing data to a user in a timely fashion at a decision point: Tell me who is calling so I can decide if I want to answer. Telephone users didn't have to make a big change to their workflow to use this information—today it appears on a phone's screen as a matter of course. Nobody has to push additional buttons or perform a special data lookup. The end user gets a valuable piece of information at the precise moment they have reason to ask. This outcome should be a goal of every organization seeking to instill a productive data culture, said Richard Starnes, principal in Deloitte's analytic and cognitive offering, who consults with Deloitte clients on analytics and business-intelligence solutions. "Turning into a data-driven organization means you have got to figure out a way to get that data unencumbered into the hands of the people that can be creative and effective with it," he said.

Hallmarks of data harmony
Getting data into the hands of people who will know exactly what to do with it starts with building sensible workflows. And for an organization to use data persuasively, the output may come closer to resembling a snapshot than a complex workbook. Here are three traits that are found within data-driven companies:

A Virtuous Cycle Of Input And Output. Miller recommends boiling down the analytical cycle to a repeatable process: a piece of information provokes an action, which kick-starts a process, which produces a new piece of data, which provokes an action. Analytics processes that don't fit this mold are possible and can be valuable, but if those are the rule rather than the exception, it may be a sign that low-hanging fruit is being left unpicked.

Data Workflows That Reflect How People Already Work. The best data-driven processes should support and enhance the jobs and responsibilities of the people engaging with them, and simplify the typical problems they are trying to solve on a daily basis. "Human patterns of collaboration illuminate how to design data systems for harmony," said David Gibbons, senior director for analytics at Salesforce. "The shape of the data can sometimes help you to understand who your team is going to connect and interact with in order to complete their work more effectively. And a flexible data platform that lets you embed analytics wherever they are needed in the middle of those collaborations will maximize success and increase data harmony in the process."

Insights That Are Easy To Consume. Instead of using complex, multi-tab workbooks to express key findings, some organizations are producing simple dashboards that are easily consumable for every level of understanding. They're also taking advantage of features that allow them to track key metrics, kind of like how you would track stocks in an investment portfolio. And for those who are primarily focused on answering "what should I do with this information?", artificial intelligence can help translate complex data into immediate next steps. "Not every business user wants to see the full data set. AI features are now built into BI [business intelligence] platforms, so users can get specific recommendations to help them make faster decisions. This results in benefits like closing deals faster or resolving customer cases with higher satisfaction," Gibbons said.

Leader's checklist: 4 steps toward data success
Data success comes more easily if senior leaders make it a priority and show in their own work how analytics should be used to drive business outcomes.

1. Share and leverage the knowledge of others
One of the obstacles to effective collaboration around data is grading yourself against your peers. In other fields, it's easy to rate and clone processes that lead to low manufacturing defect rates or efficient supply chain execution. Data is more slippery, and so the sources and skill sets that serve one team or company best might not work the same way at another. Overcome these challenges by ensuring that your data collaboration efforts are as inclusive as possible, bringing collaborators into the fold early to discuss and improve processes and outcomes. Starnes said IT is typically best positioned to support the bottom-up work of making data platforms efficient, effective and credible, while business leadership can work from the top down to put the money and strategic support behind the last mile of data delivery and collaboration.

2. Cater to a wide range of knowledge and talent
Giving every employee access to seemingly limitless data sources and analytical tools won't turn them all into equally effective knowledge workers. Although that approach may serve trained data scientists and can help data savants bubble up to the surface, it's not the most effective or coordinated way to collaborate. "There are plenty of organizations that have failed despite having massive data warehouses and big analytics investments," Miller said. Instead, ask your talented employees to articulate the problems they need data to solve, and have the experts focus on ways to help them.

3. Create spaces where contributors can seek advice
Instead of putting more training hours on people's calendars, create spaces that encourage employees to ask questions. These can be virtual spaces for teams, or drop-in clinics, of sorts, with a rotating cast of collaborators who can bring a wide range of skills to the table. Whatever platform is used to create the space where a data community can convene, it's crucial that it's flexible. As we learned from this year's sudden shift away from commuting, business travel and office usage, data consumption trends and analytical needs will change. In 2019, trends emphasized pushing more data to smaller screens and mobile devices. But over the past several months, the share of data consumed at desktop screens has climbed considerably.

4. Make data easy to question and validate
Many dramatic stories of data and analysis focus on a game-changing realization or an unanticipated surprise. The real world is more prosaic. The truth is that data is frequently used to confirm well-founded intuitions and assumptions. "Most executives have a gut sense of how they're doing, and when they see the analytics, they aren't often all that surprised," Miller said. "So if they see a number and wonder where it came from, you need to be able to track it back to the source." One effective way to ensure that happens is to put a human face on every piece of data and analysis exchanged. You can do this by certifying data sources—effectively putting a mark of approval to show the data is up-to-date and trustworthy. And there should be cultural support and incentives for providing timely responses and explanations, so that decisions aren't stalled and insights aren't discarded by users who don't have time to wait for the explanation.

Discovering the crucial facts and most valuable insights is a collaborative process. Ask contributors how data and analysis played a role in a recent success. Discuss the most-loved and least-loved data experiences and search for common threads. Visit Tableau.com to learn how to empower more people with data and explore stories of data-driven collaboration.


Deploy SSRS Projects with Two New PowerShell Commands from Blog Posts - SQLServerCentral

Anonymous
11 Nov 2020
3 min read
I built two new PowerShell commands to deploy SSRS projects, and they have finally been merged into the ReportingServicesTools module. The commands are Get-RsDeploymentConfig & Publish-RsProject. While the Write-RsFolderContent command did already exist, and is very useful, it does not support deploying the objects in your SSRS project to multiple different folders on your report server. These two new commands can handle deployment to multiple folders. The concept is fairly simple: first you run the Get-RsDeploymentConfig command to pull in all the deployment-target details from the SSRS project file. In SSRS projects you can have multiple deployment configurations, so you can specify which configuration you want to use by supplying its name to the -ConfigurationToUse parameter. This will give you back a PSObject with all the info it collected. After that, you need to add the URL of the report portal manually (unfortunately, these are not included in the SSRS project config files). You can put all of that together and see the results like this:

$RSConfig = Get-RsDeploymentConfig -RsProjectFile 'C:\source\repos\Financial Reports\SSRS_FR\SSRS_FR.rptproj' -ConfigurationToUse Dev01
$RSConfig | Add-Member -PassThru -MemberType NoteProperty -Name ReportPortal -Value 'http://localhost/PBIRSportal/'
$RSConfig

Once that looks good to you, all you have to do is pipe that object to the Publish-RsProject command, and your deployment should start:

$RSConfig | Publish-RsProject

Some quick notes:
- Obviously, the account running these commands will need a copy of the SSRS project it can point to, as well as the necessary credentials to deploy to the SSRS/PBIRS server you point it to.
- For the Get-RsDeploymentConfig command, the SSRS project you are using must be in the VS 2019 project format. Otherwise, the command won't know where to look for the correct info.
- If you don't know the name of the configuration you want to use, just point Get-RsDeploymentConfig to the project file, and it will give you back a list of configuration options to choose from (see the sketch at the end of this post).
- Make sure you run Update-Module ReportingServicesTools to get these new commands.
FYI: I only had two SSRS projects available to test these commands with. They worked great for those two projects, but your SSRS project might include some complexities that I just didn't have in either of the projects I tested with. If you have any trouble making this work, please give me a shout or file a bug on the GitHub project and I will try to help out. Big thanks to Doug Finke ( t ) for his code contributions, and Mike Lawell ( t ) for his help testing, to make these two commands a reality. The post Deploy SSRS Projects with Two New PowerShell Commands first appeared on SQLvariations: SQL Server, a little PowerShell, maybe some Power BI. The post Deploy SSRS Projects with Two New PowerShell Commands appeared first on SQLServerCentral.
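A minimal sketch of the configuration-discovery step described in the notes above; the Update-Module line comes from the post, while the project path is a placeholder:

# Make sure the module version with the new commands is installed
Update-Module ReportingServicesTools

# Point Get-RsDeploymentConfig at the project file without -ConfigurationToUse
# to get back the list of configuration options defined in the project
Get-RsDeploymentConfig -RsProjectFile 'C:\projects\Sales\Sales.rptproj'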