
Tech News


My Experience on Geeks on Screen with Coffee from Blog Posts - SQLServerCentral

Anonymous
24 Nov 2020
2 min read
Last week I appeared on Geeks on Screen with Coffee, which broadcasts on YouTube and is a different sort of broadcast to be on, in a good way. Mark Pryce-Maher tries to get to know his guests on a more personal level outside of tech, because there is more to everyone besides what they do in tech. Oh, and our debate over Irish tea is quite funny. We touched on many subjects, such as my volunteer work with foster children (see this blog post for more info) and my work as a mental health advocate (see my bio for that). I don't think my favorite cheese counted as a cheese in European eyes, but hey, I haven't tried that many cheeses, so that's ok. And with the book he uses for a random question, it was hard to come up with a question I could answer; it took four tries. Below is the YouTube video for anyone interested in watching and seeing a different side of me. It was a really fun experience. The full playlist can be found here. He has interviewed some interesting people in the industry. I mean, I know I have to go watch the episodes with Buck Woody and Jen Stirrup and many others. Maybe this is what I binge-watch over Thanksgiving. The post My Experience on Geeks on Screen with Coffee first appeared on Tracy Boggiano's SQL Server Blog. The post My Experience on Geeks on Screen with Coffee appeared first on SQLServerCentral.

Captioning Options for Your Online Conference from Blog Posts - SQLServerCentral

Anonymous
24 Nov 2020
13 min read
Many conferences have moved online this year due to the pandemic, and many attendees are expecting captions on videos (both live and recorded) to help them understand the content. Captions can help people who are hard of hearing, but they also help people who are trying to watch presentations in noisy environments and those who lack good audio setups as they are watching sessions. Conferences arguably should have been providing live captions for the in-person events they previously held. But since captions are finally becoming a wider topic of concern, I want to discuss how captions work and what to look for when choosing how to caption content for an online conference. There was a lot of information that I wanted to share about captions, and I wanted it to be available in one place. If you don't have the time or desire to read this post, there is a summary at the bottom.

Note: I'm not a professional accessibility specialist. I am a former conference organizer and current speaker who has spent many hours learning about accessibility and looking into options for captioning. I'm writing about captions here to share what I've learned with other conference organizers and speakers.

Closed Captions, Open Captions, and Subtitles

Closed captions provide the option to turn captions on or off while watching a video. They are usually shown at the bottom of the video. Here's an example of one of my videos on YouTube with closed captions turned on: the CC button at the bottom has a red line under it to indicate the captions are on. The placement of the captions may vary based upon the service used and the dimensions of the screen. For instance, if I play this video full screen on my wide-screen monitor, the captions cover some of the content instead of being shown below.

Open captions are always displayed with the video – there is no option to turn them off. The experience with open captions is somewhat like watching a subtitled foreign film. But despite captions often being referred to colloquially as subtitles, there is a difference between the two. Captions are made for those who are hard of hearing or have auditory processing issues. Captions should include any essential non-speech sound in the video as well as speaker differentiation if there are multiple speakers. Subtitles are made for viewers who can hear and just need the dialogue provided in text form. For online conferences, I would say that closed captions are preferred, so viewers can choose whether or not to show the captions.

How Closed Captions Get Created

Captions can either be created as a sort of timed transcript that gets added to a pre-recorded video, or they can be done in real time. Live captioning is sometimes called communication access real-time translation (CART). If you are captioning a pre-recorded video, the captions get created as a companion file to your video. There are several formats for caption files, but the most common I have seen are .SRT (SubRip Subtitle) and .VTT (Web Video Text Tracks). These are known as simple closed caption formats because they are human readable – showing a timestamp or sequence number and the caption in plain text, with a blank line between each caption (a small sample follows below).
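To make that format concrete, here is a minimal, made-up WebVTT (.VTT) example – the timestamps and cue text are invented purely for illustration:

WEBVTT

1
00:00:01.000 --> 00:00:04.000
Welcome, everyone, to this session on captioning.

2
00:00:04.500 --> 00:00:08.200
[applause] Let's start with why captions matter.

An .SRT file looks almost identical, except it drops the WEBVTT header line and uses a comma instead of a period before the milliseconds (00:00:01,000).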
Who Does the Captions

There are multiple options for creating captions. The first thing to understand is that captioning is a valuable service, and it costs money and/or time. In general, there are three broad options for creating captions on pre-recorded video:

- Authors or conference organizers manually create a caption file
- Presentation software creates a caption file using AI
- A third-party service creates a caption file with human transcription, AI, or a combination of both

Manually creating a caption file

Some video editing applications allow authors to create caption files. For example, Camtasia provides a way to manually add captions or to upload a transcript and sync it to your video. Alternatively, there is a VTT Creator that lets you upload your video, write your captions with the video shown so you get the timing right, and then output your .VTT file. Another approach is to use speech-to-text software to create a transcript of everything said during the presentation and then edit that transcript into a caption file. Services like YouTube offer auto-captioning, so if it's an option to upload as a private video and get the caption file from there, that is a good start. But you will need to go back through and edit the captions to ensure accuracy with either of these approaches. Vimeo also offers automatic captioning, but the results will also need to be reviewed and edited for accuracy.

These are valid approaches when you don't have other options, but they can be very time consuming and the quality may vary. This might be ok for one short video, but is probably not ideal for a conference. If you are going to make presenters responsible for their own captions, you need to provide them with plenty of time to create the captions and suggest low-cost ways to auto-generate captions. I've seen estimates that it can take up to 5 hours for an inexperienced person to create captions for one hour of content. Please be aware of the time commitment you are requesting of your presenters if you put this responsibility on them.

Captions in Your Presentation Software

Depending on the platform you use, your presentation software might provide AI-driven live captioning services. This is also known as Automatic Speech Recognition (ASR). For example, Teams offers a live caption service. As of today (November 2020), my understanding is that Zoom, GoToMeeting, and GoToWebinar do not offer built-in live caption services. Zoom allows you to let someone type captions or integrate with a 3rd-party caption service. Zoom and GoToMeeting/GoToWebinar do offer transcriptions of meeting audio after the fact using an AI service.

PowerPoint also offers live captioning via its subtitles feature. My friend Echo made a video and blog post to show the effectiveness of PowerPoint subtitles, which you can view here. There are a couple of things to note before using this PowerPoint feature:

- It only works while PowerPoint is in presentation mode. If you have demos or need to refer to a document or website, you will lose captions when you open the document or web browser.
- If you are recording a session, your subtitles will be open subtitles embedded into your video. Viewers will not be able to turn them off.
- The captions will only capture the audio of the presenter who is running the PowerPoint. Other speakers will not have their voice recorded and will not be included in the captions.

Google Slides also offers live captions. The same limitations noted for PowerPoint apply to Google Slides as well.

Third-Party Caption Services

There are many companies that provide captioning services for both recorded and live sessions. This can be a good route to go to ensure consistency and quality.
But all services are not created equal – quality will vary. For recorded sessions, you send them video files and they give you back caption files (.VTT, .SRT, or another caption file format). They generally charge you per minute of content. Some companies offer only AI-generated captions. Others offer AI- or human-generated captions, or AI-generated captions with human review. Human transcription tends to cost more than AI, but it also tends to have higher accuracy. But I have seen some impressively accurate AI captions. Captions on recorded content are often less expensive than live captions (CART).

Below are a few companies I have come across that offer caption services. This is NOT an endorsement. I'm listing them so you can see examples of their offerings and pricing. Most of them offer volume discounts or custom pricing.

- Otter.ai – offers AI-generated captions for both recorded and live content, bulk import/export, team vocabulary
- 3PlayMedia – offers AI-generated and human-reviewed captions for recorded content, AI-generated captions for live content. (Their standard pricing is hidden behind a form, but it's currently $0.60 per minute of live auto-captioning and $2.50 per minute of closed captions for recorded video.)
- Rev – offers captions for both recorded and live content, shared glossaries and speaker names to improve accuracy

The Described and Captioned Media Program maintains a list of captioning service vendors for your reference. If you have used a caption service for a conference and want to share your opinion to help others, feel free to leave a comment on this post.

Questions for Conference Organizers to Ask When Choosing a Captioning Vendor

For recorded or live video:
- What is your pricing model/cost? Do you offer bulk discounts or customized pricing?
- Where/how will captions be shown in my conference platform? (If it will overlay video content, you need to notify speakers to adjust content to make room for it. But try to avoid this issue where possible.)
- Is there an accuracy guarantee for the captions? How is accuracy measured?
- Can I provide a list of names and a glossary of technical terms to help improve the caption accuracy?
- Does the captioning service support multiple speakers? Does it label speakers' dialogue to attribute it to the right person?
- Does the captioning service conform to DCMP or WCAG captioning standards? (Helps ensure quality and usability)
- How does the captioning service keep my files and information secure (platform security, NDAs, etc.)?
- What languages does the captioning service support? (Important if your sessions are not all in English)

For recorded video:
- Does my conference platform support closed captions? (If it doesn't, then open captions encoded into the video will be required.)
- What file type should captions be delivered in to be added to the conference platform?
- What is the required lead time for the captioning service to deliver the caption files?
- How do I get videos to the caption service?

For captions on live sessions:
- Does the live caption service integrate with my conference/webinar platform?
- How do I get support if something goes wrong? Is there an SLA?
- What is the expected delay from the time a word is spoken to when it appears to viewers?

Further Captioning Advice for Conference Organizers

Budget constraints are real, especially if you are a small conference run by volunteers that doesn't make a profit.
Low quality captions can be distracting, but no captions means you have made a decision to exclude people who need captions. Do some research on pricing from various vendors, and ask what discounts are available. You can also consider offering a special sponsorship package where a sponsor can be noted as providing captions for the conference. If you are running a large conference, this should be a line item in your budget. Good captions cost money, but that isn't an excuse to go without them.

If your conference includes both live and recorded sessions, you can find a vendor that does both. You'll just want to check prices to make sure they work for you. If your budget means you have to go with ASR, make sure to allow time to review and edit closed captions on recorded video. Try to get a sample of the captions from your selected vendor to ensure quality beforehand.

If possible for recorded videos, allow speakers to preview the captions to ensure quality. Some of them won't, but some will. And it's likely a few errors will have slipped through that can be caught and corrected by the speakers or the organizer team. This is especially important for deeply technical or complex topics. Make sure you have plenty of lead time for recorded videos. If a speaker is a few days late delivering a video, make sure their video can still be captioned and confirm if there is an extra fee.

Final Thoughts and Recap

If you'd like more information about captions, 3PlayMedia has an Ultimate Guide to Closed Captioning with tons of good info. Feel free to share any tips or tricks you have for captioning conference sessions in the comments. I've summarized the info in this post below for quick reference.

Terms to Know
- Closed captions: captions that can be turned on and off by the viewer
- Open captions: captions that are embedded into the video and cannot be turned off
- CART: communication access real-time translation, a technical term for live captioning
- ASR: automatic speech recognition, use of artificial intelligence technology to generate captions
- .SRT and .VTT: common closed caption file formats

Choosing a Captioning Solution for Your Conference

This diagram represents general trends and common decision points when choosing a captioning solution.
Your specific situation may vary from what is shown here.

Summary of Caption Solutions

Manual creation of caption files for recorded sessions
Cost: None
Time/Effort: High
Pros:
- Doesn't require a third-party integration
- Supports closed captions
- Works no matter what application is shown on the screen
- Works no matter what application is used to record and edit video
Cons:
- Accuracy will vary widely
- Manual syntax errors can cause the file to be unusable

Upload to YouTube, Vimeo, or another service that offers free captions
Cost: None to Low
Time/Effort: Medium
Pros:
- Supports closed captions
- Works no matter what application is shown on the screen
- Works no matter what application is used to record and edit video
Cons:
- Not available for live sessions
- Requires editing of captions to achieve acceptable accuracy
- Requires an account with the service and (at least temporary) permission to upload the video
- Accuracy will vary widely

Auto-generated captions in presentation software (e.g., PowerPoint, Google Slides)
Cost: Low
Time/Effort: Low
Pros:
- Works for live and recorded sessions
- No third-party integrations required
Cons:
- Requires that all presenters use presentation software with this feature
- Must be enabled by the presenter
- Won't work when the speaker is showing another application
- Often offers only open captions
- Accuracy may vary
- Often only captures one speaker

ASR (AI-generated) captions from a captioning service
Cost: Medium
Time/Effort: Low
Pros:
- Works for live and recorded sessions
- Supports closed captions
- Works no matter what application is shown on the screen
- Works no matter what application is used to record and edit video
Cons:
- Accuracy may vary
- Requires planning to meet lead times for recorded sessions
- Poor viewer experience if delay is too large during live sessions

Human-generated or human-reviewed captions from a captioning service
Cost: High
Time/Effort: Low
Pros:
- Ensures the highest quality with the lowest effort from conference organizers and speakers
- Works for live and recorded sessions
- Works no matter what application is shown on the screen
- Works no matter what application is used to record and edit video
Cons:
- Requires planning to meet lead times for recorded sessions
- Poor viewer experience if delay is too large during live sessions

I hope you find this exploration of options for captions in online conference content helpful. Let me know in the comments if you have anything to add to this post to help other conference organizers. The post Captioning Options for Your Online Conference appeared first on SQLServerCentral.

Daily Coping 24 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
24 Nov 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to look at life through someone else's eyes and see their perspective. In today's world, in many places, I find that people lack the ability to look at the world through others' eyes. We've lost some empathy and willingness to examine things from another perspective. This is especially true during the pandemic, where politics and frustrations seem to be overwhelming. I have my own views, but I had the chance to hang out with a friend recently. This person sees the world differently than I do, but I decided to understand, not argue or complain. In this case, the person talked a bit about why they agreed or disagreed with particular decisions or actions by the state or individuals. I asked questions for clarification or more detail, but allowed this person to educate me on their point of view. It was a good conversation, in a way that's often lost in the media or in larger groups. I didn't agree with everything, and did feel there were some emotions overriding logic, but I could understand and appreciate the perspective, even if I disagreed with portions. I like these conversations and I wish we could have more of them in small groups, in a civilized fashion. The post Daily Coping 24 Nov 2020 appeared first on SQLServerCentral.

Virtual Log Files from Blog Posts - SQLServerCentral

Anonymous
24 Nov 2020
4 min read
Today's post is a guest article from a friend of Dallas DBAs, writer, and fantastic DBA Jules Behrens (B|L).

One common performance issue that is not well known but should still be on your radar as a DBA is a high number of VLFs. Virtual Log Files are the files SQL Server uses to do the actual work inside a SQL log file (MyDatabase_log.LDF). It allocates new VLFs every time the log file grows. Perhaps you've already spotted the problem – if the log file is set to grow by a tiny increment and the file ever grows very large, you may end up with thousands of tiny little VLFs, and this can slow down your performance at the database level. Think of it like a room (the log file) filled with boxes (the VLFs). If you just have a few boxes, it is more efficient to figure out where something (a piece of data in the log file) is than if you have thousands of tiny boxes. (Analogy courtesy of @SQLDork)

It is especially evident there is an issue with VLFs when SQL Server takes a long time to recover from a restart. Other symptoms may be slowness with autogrowth, log shipping, replication, and general transactional slowness. Anything that touches the log file, in other words. The best solution is prevention – set your log file to be big enough to handle its transaction load to begin with, set it to have a sensible growth rate in proportion to its size, and you'll never see this come up. But sometimes we inherit issues where best practices were not followed, and a high number of VLFs is certainly something to check when doing a health assessment on an unfamiliar environment.

The built-in DMV sys.dm_db_log_info is specifically for finding information about the log file, and the command DBCC LOGINFO (deprecated) will return a lot of useful information about VLFs as well (a short T-SQL sketch follows at the end of this post). There is an excellent script for pulling the count of VLFs that uses DBCC LOGINFO from Kev Riley, on Microsoft TechNet: https://gallery.technet.microsoft.com/scriptcenter/SQL-Script-to-list-VLF-e6315249 There is also a great script by Steve Rezhener on SQLSolutionsGroup.com that utilizes the view: https://sqlsolutionsgroup.com/capture-sql-server-vlf-information-using-a-dmv/ Either one of these will tell you what you ultimately need to know – whether your VLFs are an issue.

How many VLFs are too many? There isn't an industry standard, but for the sake of a starting point, let's say a tiny log file has 500 VLFs. That is high. A 5GB log file with 200 VLFs, on the other hand, is perfectly acceptable. You'll likely know a VLF problem when you find it; you'll run a count on the VLFs and it will return something atrocious like 20,000. (ed – someone at Microsoft support told me about one with 1,000,000 VLFs)

If the database is in the Simple recovery model and doesn't see much traffic, this is easy enough to fix. Manually shrink the log file as small as it will go, verify the autogrow is appropriate, and grow it back to its normal size. If the database is in the Full recovery model and is in high use, it's a little more complex. Follow these steps (you may have to do it more than once):

1. Take a transaction log backup.
2. Issue a CHECKPOINT manually.
3. Check the empty space in the transaction log to make sure you have room to shrink it.
4. Shrink the log file as small as it will go.
5. Grow the file back to its normal size.
6. Lather, rinse, repeat as needed.

Now check your VLF counts again, and make sure you are down to a nice low number. Done! Thanks for reading! The post Virtual Log Files appeared first on DallasDBAs.com.
The post Virtual Log Files appeared first on SQLServerCentral.
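A rough T-SQL sketch of the checks and fix described above. This is illustrative only: sys.dm_db_log_info requires SQL Server 2016 SP2 or 2017 and later, and the database, file, path, and size values are hypothetical – substitute your own and test before running in production.

-- Count VLFs per database; unusually high counts deserve a closer look
SELECT d.name AS database_name,
       COUNT(*) AS vlf_count
FROM sys.databases AS d
CROSS APPLY sys.dm_db_log_info(d.database_id) AS li
GROUP BY d.name
ORDER BY vlf_count DESC;

-- Reduce VLFs for a busy database in FULL recovery (example names and sizes)
BACKUP LOG [MyDatabase] TO DISK = N'X:\Backup\MyDatabase_log.trn';
USE [MyDatabase];
CHECKPOINT;
DBCC SQLPERF(LOGSPACE);                  -- confirm there is free space in the log
DBCC SHRINKFILE (N'MyDatabase_log', 1);  -- shrink the log file as small as it will go
ALTER DATABASE [MyDatabase]
    MODIFY FILE (NAME = N'MyDatabase_log', SIZE = 8192MB, FILEGROWTH = 1024MB); -- grow back in one step

Repeat the backup/checkpoint/shrink cycle if the shrink doesn't release space on the first pass, then re-run the count query to confirm the VLF count has dropped.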

Power BI – Hungry Median from Blog Posts - SQLServerCentral

Anonymous
24 Nov 2020
8 min read
Introduction

Median is a useful statistical function, which first appeared in SSAS 2016 and in Power BI around that year as well. There are several articles on how to implement the median function in DAX from the time before the native DAX function was introduced. With one client we recently faced an issue when using the implicit median function in Power BI. The size of the dataset was roughly 30 million records. I would say nothing challenging for Power BI or DAX itself. However, the behavior of the median function was not convincing at all.

Let's look at the setup. I created a median dataset based on free data from weather sensors in one city (a link to download at the end of the blog) which has similar data characteristics as our report with the original issue. We have the following attributes: date, hour, location (just a numeric ID of the location, which is fine for our test), and we are monitoring the temperature. We have 35 million records -> 944 unique values for temperature, 422 unique locations, and 24 hours of course. Now we make a simple report – we would like to see the median temperature per hour regardless of date or location. Measure:

MEASURE Senzors[Orig_Med] = MEDIAN ( Senzors[temperature] )

The following result took 71 seconds to complete on the dataset in Power BI Desktop, and took almost 8 GB of memory (see the memory profile during the DAX query). If you try to publish this report to the Power BI service, you will get an error message. I was just WOW! But what can I tune on such a simple query and such a simple measure?

Tuning 1 – Rewrite Median?

I was a bit disappointed about the median function. When we used date for filtering, the performance of the query was OK. But when we used a larger dataset it was not performing at all. I know nothing about the inner implementation of the median function in DAX, but based on memory consumption it seems as if there is column materialization in the background and sorting when searching for the median.

Here's a bit of theory about the median and a bit of fact about columnar storage, so we can discover how to take advantage of the data combination/granularity we have in the model. Below are two median samples for a set of numbers – when the count of the numbers is even and when it is odd. More median theory on Wikipedia. The rules for calculating the median are the same even when numbers in the set are repeating (non-unique). Here are the steps of the potential algorithm:

1. Sort existing values.
2. Find the median position(s).
3. Take a value or two and make an average to get the median.

Let's look at this from the perspective of a column store where we have just a couple of values with hundreds of repeats. As we know, count is very fast for a column store, and that could be our advantage, as we have a small number of unique values repeated many times. Following is an example of data where we can visualize how to take advantage of the fact described above.

Temperature | Count | Cumulative Count | Cumulative Count Start
12          | 500   | 500              | 1
13          | 500   | 1000             | 501
18          | 500   | 1500             | 1001
20          | 501   | 2001             | 1501

Total Count: 2001
Position of median (odd): 1001
Position of median (even): 1001

In this case, we just need to go through 4 values and find in which interval our median position belongs.
In the worst-case scenario, we will hit between two values, as in the following example (we changed the last count from 501 to 500):

Temperature | Count | Cumulative Count | Cumulative Count Start
12          | 500   | 500              | 1
13          | 500   | 1000             | 501
18          | 500   | 1500             | 1001
20          | 500   | 2000             | 1501

Total Count: 2000
Position of median (odd): 1000
Position of median (even): 1001

How to implement this in DAX: the first helper measures are a count and a cumulative count for temperature:

MEASURE Senzors[RowCount] =
COUNTROWS ( Senzors )

MEASURE Senzors[TemperatureRowCountCumul] =
VAR _curentTemp = MAX ( 'Senzors'[temperature] )
RETURN
    CALCULATE (
        COUNTROWS ( Senzors ),
        Senzors[temperature] <= _curentTemp
    )

The second and third measures give us the position of the median for the given context:

MEASURE Senzors[MedianPositionEven] =
ROUNDUP ( ( COUNTROWS ( Senzors ) / 2 ), 0 )

MEASURE Senzors[MedianPositionOdd] =
VAR _cnt = COUNTROWS ( Senzors )
RETURN
    ROUNDUP ( ( _cnt / 2 ), 0 )
        -- this is a trick where a boolean is auto-cast to int (0 or 1)
        + ISEVEN ( _cnt )

The fourth measure – the calculated median – does what we described in the tables above: iterate through the temperature values, find the row(s) that contain the median positions, and average those row(s).

MEASURE Senzors[Calc_Med] =
-- get the two possible positions of the median
VAR _mpe = [MedianPositionEven]
VAR _mpeOdd = [MedianPositionOdd]
-- build a temperature table in the current context with the positions where each value starts and finishes
VAR _TempMedianTable =
    ADDCOLUMNS (
        VALUES ( Senzors[temperature] ),
        "MMIN", [TemperatureRowCountCumul] - [RowCount] + 1,
        "MMAX", [TemperatureRowCountCumul]
    )
-- filter the table to keep only the values that contain the median positions
VAR _T_MedianVals =
    FILTER (
        _TempMedianTable,
        ( _mpe >= [MMIN] && _mpe <= [MMAX] )
            || ( _mpeOdd >= [MMIN] && _mpeOdd <= [MMAX] )
    )
-- return the average of the filtered dataset (one or two rows)
RETURN
    AVERAGEX ( _T_MedianVals, [temperature] )

The maximum number of rows that goes into the final average is 2. Let us see the performance of such a measure:

Performance for hour (24 values) | Duration (s) | Memory Consumed (GB)
Native median function           | 71           | 8
Custom implementation            | 6.3          | 0.2

Sounds reasonable and promising! But not so fast – when the number of values by which we group the data grows, the duration grows as well. Here are some statistics when removing hour (24 values) and bringing location (400+ values) into the table.

Performance for location (422 values) | Duration (s) | Memory Consumed (GB)
Native median function                | 81           | 8
Custom implementation                 | 107          | 2.5

Look at the memory consumption profile of the calculated median for location: that is not so good anymore! Our custom implementation is a bit slower for location, and despite the fact that it is consuming a lot less memory, it will not work in the Power BI service either. This means that we solved just a part of the puzzle – our implementation works fine only when we have a small number of values that we are grouping by. So, what are the remaining questions to make this report work in the PBI service?

1. How to improve the overall duration of the query?
2. How to decrease memory consumption?

Tuning 2 – Reduce Memory Consumption

We start with the memory consumption part. First, we need to identify which part of the formula is eating so much memory. Actually, it is the same one that has the most performance impact on the query.
It's this formula for the cumulative count, which is evaluated for each row of location multiplied by each value of temperature:

MEASURE Senzors[TemperatureRowCountCumul] =
VAR _curentTemp = MAX ( 'Senzors'[temperature] )
RETURN
    CALCULATE (
        COUNTROWS ( Senzors ),
        Senzors[temperature] <= _curentTemp
    )

Is there a different way to get a cumulative count without using CALCULATE? Maybe a way that is more transparent to the Power BI engine? Yes, there is! We can remodel the temperature column and define the cumulative sorted approach as a many-to-many relationship towards the sensors. Sample content of the temperature tables would look like this (shown as an image in the original post). I believe that the picture is self-describing. As a result of this model, when you use the temperature attribute from the TemperatureMapping table, you have:

- Cumulative behavior of RowCount.
- The relation calculated in advance.

For this new model version, we define the measures as below. The RowCount measure we already have, but with temperature from the mapping table it will in fact give us the cumulative count:

MEASURE Senzors[RowCount] =
COUNTROWS ( Senzors )

We must create a new measure which will give us a normal count for the mapping table, to be able to calculate the starting position of the temperature value:

MEASURE Senzors[TemperatureMappingRowCount] =
CALCULATE (
    [RowCount],
    FILTER (
        TemperatureMapping,
        TemperatureMapping[LowerTemperature] = TemperatureMapping[temperature]
    )
)

New median definition:

MEASURE Senzors[Calc_MedTempMap] =
VAR _mpe = [MedianPositionEven]
VAR _mpeOdd = [MedianPositionOdd]
VAR _TempMedianTable =
    ADDCOLUMNS (
        VALUES ( TemperatureMapping[temperature] ),
        "MMIN", [RowCount] - [TemperatureMappingRowCount] + 1,
        "MMAX", [RowCount]
    )
VAR _T_MedianVals =
    FILTER (
        _TempMedianTable,
        ( _mpe >= [MMIN] && _mpe <= [MMAX] )
            || ( _mpeOdd >= [MMIN] && _mpeOdd <= [MMAX] )
    )
RETURN
    AVERAGEX ( _T_MedianVals, [temperature] )

Alright, let's check the performance – the memory consumption is now just in MBs!

Performance, many-to-many median | Duration (s) | Memory Consumed (GB)
Used with hours                  | 2.2          | 0.02
Used with location               | 41.1         | 0.08

I think we can be happy about it, and the memory puzzle seems to be solved. You can download a sample PBI file (I decreased the data to only one month, but you can download the whole dataset). Below is the statistics summary for now:

Performance for hour (24 values) | Duration (s) | Memory Consumed (GB)
Native median function           | 71.00        | 8.00
Custom implementation            | 6.30         | 0.20
Many-to-many median              | 2.20         | 0.02

Performance for location (422 values) | Duration (s) | Memory Consumed (GB)
Native median function                | 81.00        | 8.00
Custom implementation                 | 107.00       | 2.50
Many-to-many median                   | 41.10        | 0.08

I'll stop this blog here, as it is too long already. Next week, I'll bring the second part on how to improve performance, so the user has a better experience while using this report. The post Power BI – Hungry Median appeared first on SQLServerCentral.

Top 10 Must-Ask Questions Before Going to the Cloud from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
1 min read
After architecting and migrating some of the world's largest data platforms, I want to walk through my top ten must-ask questions before migrating to the cloud, so that you can save time, money, and grief. This is a must-watch for every CXO who wants to begin their cloud journey with a solid foundation! The post Top 10 Must-Ask Questions Before Going to the Cloud first appeared on Convergence of Data, Cloud, and Infrastructure. The post Top 10 Must-Ask Questions Before Going to the Cloud appeared first on SQLServerCentral.

Markdown Links– Remembering the Basics from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
1 min read
For some reason, I can never remember how to do markdown links. I know to use the hash symbol (#) for headings. I know that asterisks do bold, or italics, though I can't remember one or two without checking. This seems to work for me. Though when I check markdownguide.org, I see I've forgotten that underscores can be used. I know numbers and dashes for lists; that's intuitive. I get that and use it all the time. But I can't remember how to do links. I know parentheses and brackets are involved, but I constantly seem to get it wrong.

Writing helps memory

At least for me, writing actually helps my memory. So, I'm writing this on a piece of paper: [text](link) As a matter of fact, I'll write it a couple of times, and type it here: [text](link) [voice of the dba](http://www.voiceofthedba.com) while I'm at it. ![text](image) Maybe now I'll remember this. The post Markdown Links– Remembering the Basics appeared first on SQLServerCentral.
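In that spirit, here is a compact cheat sheet of the basic syntax the post mentions (nothing beyond standard markdown):

# Heading 1
## Heading 2
**bold** or __bold__
*italic* or _italic_
1. numbered list item
- dashed list item
[link text](http://www.voiceofthedba.com)
![alt text](image.png)

The only difference between a link and an image is the leading exclamation mark.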

Why Aren’t You Automating Database Deployments? from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
1 min read
Building out processes and mechanisms for automated code deployments and testing can be quite a lot of work and isn’t easy. Now, try the same thing with data, and the challenges just shot through the roof. Anything from the simple fact that you must maintain the persistence of the data to data size to up […] The post Why Aren’t You Automating Database Deployments? appeared first on Grant Fritchey. The post Why Aren’t You Automating Database Deployments? appeared first on SQLServerCentral.

Daily Coping 23 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to go outside: play, walk, run, exercise, relax. Last Friday Colorado moved a number of counties to the "red" level for the pandemic. We also added a "purple" level to our dial of risk. I appreciate trying to manage things, while also finding ways for life to move forward. This means that for some people, the level of restriction that exists precludes much ability to leave your house. However, I haven't seen a complete lockdown anywhere that prevents any leaving. You have your own tolerance for risk, but for me, going outside to walk a little, to take a trip to a store or restaurant (even with contactless delivery/pickup), or even to go say hi to a neighbor from a distance is worth it. There is something about being in nature, even in a city, being outside, looking around, enjoying life away from the four walls of my residence that I like. I'm lucky to be on a ranch with the need to go outside regularly, especially if something is broken, but I appreciate the ability to walk outside in cities when I travel, taking a moment to enjoy life outside, watch the world, and walk, run, or even sit with a drink and see the world around me. Recently I've been chatting with some friends, and I left my desk to walk outside and continue the conversation there. It helps when you have a couple of friends hanging out as well. The post Daily Coping 23 Nov 2020 appeared first on SQLServerCentral.

Differences between using a Load Balanced Service and an Ingress in Kubernetes from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
5 min read
What is the difference between using a load balanced service and an ingress to access applications in Kubernetes? Basically, they achieve the same thing – being able to access an application that's running in Kubernetes from outside of the cluster – but there are differences! The key difference between the two is that an ingress operates at networking layer 7 (the application layer), so it routes connections based on the HTTP host header or URL path. Load balanced services operate at layer 4 (the transport layer), so they can load balance arbitrary TCP/UDP/SCTP services.

OK, that statement doesn't really clear things up (for me anyway). I'm a practical person by nature… so let's run through examples of both (running everything in Kubernetes for Docker Desktop). What we're going to do is spin up two nginx pages that will serve as our applications, first use load balanced services to access them, and then use an ingress.

So let's create two nginx deployments from a custom image (available on the GHCR):

kubectl create deployment nginx-page1 --image=ghcr.io/dbafromthecold/nginx:page1
kubectl create deployment nginx-page2 --image=ghcr.io/dbafromthecold/nginx:page2

And expose those deployments with a load balanced service:

kubectl expose deployment nginx-page1 --type=LoadBalancer --port=8000 --target-port=80
kubectl expose deployment nginx-page2 --type=LoadBalancer --port=9000 --target-port=80

Confirm that the deployments and services have come up successfully:

kubectl get all

OK, now let's check that the nginx pages are working. As we've used a load balanced service in Kubernetes in Docker Desktop, they'll be available as localhost:PORT:

curl localhost:8000
curl localhost:9000

Great! So we're using the external IP address (localhost in this case) and a port number to connect to our applications. Now let's have a look at using an ingress. First, let's get rid of those load balanced services:

kubectl delete service nginx-page1 nginx-page2

And create two new cluster IP services:

kubectl expose deployment nginx-page1 --type=ClusterIP --port=8000 --target-port=80
kubectl expose deployment nginx-page2 --type=ClusterIP --port=9000 --target-port=80

So now we have our pods running and two cluster IP services, which aren't accessible from outside of the cluster. The services have no external IP, so what we need to do is deploy an ingress controller. An ingress controller will provide us with one external IP address that we can map to a DNS entry. Once the controller is up and running, we then use an ingress resource to define routing rules that map external requests to different services within the cluster. Kubernetes currently supports GCE and nginx controllers; we're going to use an nginx ingress controller. To spin up the controller, run:

kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/controller-v0.40.2/deploy/static/provider/cloud/deploy.yaml

That creates a number of resources in its own namespace. To confirm they're all up and running:

kubectl get all -n ingress-nginx

Note the external IP of "localhost" for the ingress-nginx-controller service. OK, now we can create an ingress to direct traffic to our applications.
Here's an example ingress.yaml file:

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: ingress-testwebsite
  annotations:
    kubernetes.io/ingress.class: "nginx"
spec:
  rules:
  - host: www.testwebaddress.com
    http:
      paths:
      - path: /pageone
        pathType: Prefix
        backend:
          service:
            name: nginx-page1
            port:
              number: 8000
      - path: /pagetwo
        pathType: Prefix
        backend:
          service:
            name: nginx-page2
            port:
              number: 9000

Watch out here. In Kubernetes v1.19 ingress went GA, so the apiVersion changed. The yaml above won't work in any version prior to v1.19. Anyway, the main points in this yaml are:

  annotations:
    kubernetes.io/ingress.class: "nginx"

which makes this ingress resource use our nginx ingress controller;

  rules:
  - host: www.testwebaddress.com

which sets the URL we'll be using to access our applications to http://www.testwebaddress.com; and

      - path: /pageone
        pathType: Prefix
        backend:
          service:
            name: nginx-page1
            port:
              number: 8000
      - path: /pagetwo
        pathType: Prefix
        backend:
          service:
            name: nginx-page2
            port:
              number: 9000

which routes our requests to the backend cluster IP services depending on the path (e.g. http://www.testwebaddress.com/pageone will be directed to the nginx-page1 service).

You can create the ingress.yaml file manually and then deploy it to Kubernetes, or just run:

kubectl apply -f https://gist.githubusercontent.com/dbafromthecold/a6805ca732eac278e902bbcf208aef8a/raw/e7e64375c3b1b4d01744c7d8d28c13128c09689e/testnginxingress.yaml

Confirm that the ingress is up and running (it'll take a minute to get an address):

kubectl get ingress

N.B. – Ignore the warning if you get one; we're using the correct API version.

Finally, we now also need to add an entry for the web address into our hosts file (simulating a DNS entry):

127.0.0.1 www.testwebaddress.com

And now we can browse to the web pages to see the ingress in action! And that's the difference between using load balanced services or an ingress to connect to applications running in a Kubernetes cluster. The ingress allows us to use only the one external IP address and then route traffic to different backend services, whereas with the load balanced services, we would need to use different IP addresses (and ports if configured that way) for each application. Thanks for reading! The post Differences between using a Load Balanced Service and an Ingress in Kubernetes appeared first on SQLServerCentral.
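One extra trick, not from the original post but assuming the same Docker Desktop setup as above: instead of editing the hosts file, you can usually test the routing by sending the Host header directly to the ingress controller on localhost:

curl -H "Host: www.testwebaddress.com" http://localhost/pageone
curl -H "Host: www.testwebaddress.com" http://localhost/pagetwo

Each request should come back from the matching nginx deployment, because the nginx ingress controller matches the Host header and path against the ingress rules before forwarding to the backend cluster IP service.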

sp_RestoreScript 1.8 Now Released from Blog Posts - SQLServerCentral

Anonymous
23 Nov 2020
1 min read
It looks like there was a bug lurking in sp_RestoreScript that was causing the wrong ALTER DATABASE command to be generated when using @SingleUser and a WITH MOVE parameter. 1.8 fixes this issue. For information and documentation please visit https://sqlundercover.com/2017/06/29/undercover-toolbox-sp_restorescript-a-painless-way-to-generate-sql-server-database-restore-scripts/ The post sp_RestoreScript 1.8 Now Released appeared first on SQLServerCentral.

Azure SQL Database and Memory from Blog Posts - SQLServerCentral

Anonymous
22 Nov 2020
1 min read
There are many factors to consider when you are thinking about the move to Azure SQL Database (PaaS) – this could be anything from single databases (provisioned compute or serverless) to elastic pools. Going through your head should be how many vCores … Continue reading → The post Azure SQL Database and Memory appeared first on SQLServerCentral.

The Future of PASS from Blog Posts - SQLServerCentral

Anonymous
21 Nov 2020
3 min read
2020 has been a tough year for PASS. Its primary fundraiser – the PASS Summit – was converted to a virtual event that attracted fewer attendees and far less revenue than the in-person version. Going into the event they were projecting a budget shortfall of $1.5 million for the fiscal year ending in June of 2021, and that's after some cost cutting. My guess is that the net revenue from the Summit will be less than projected in the revised budget, so the shortfall will increase, only partially offset by $1m in reserves. I'm writing all of that based on information on the PASS site and one non-NDA conversation with some Board members during the 2020 PASS Summit. It's not a happy picture. If things aren't that dire, I'd be thrilled. I'll pause here to say this – it doesn't matter how we got here. The Board has to work the problem they have. When the Board meets in November or December with a final accounting from the Summit, they will have to adjust the budget again and start talking about a 2021 budget. Big questions:

- How much is the shortfall for 2020, and can we reduce the spend rate enough to make up the difference and have money in the bank to carry through to a prospective 2021 Summit?
- If we have to reduce staff, which ones? Can we keep the key people that would drive the next in-person event? Can it be a furlough, or is it worse? How much notice can we give them?
- Does PASS have the option to exit from any contracts around the in-person 2021 Summit right now without penalty, or will that be conditional based on restrictions in place due to Covid?
- Will event insurance claims cover any of the revenue gap in 2020?
- Even if a vaccine is being distributed, does PASS bet it all on an in-person event in 2021? What's the minimum attendee number needed to generate net revenue equivalent to the 2020 Virtual Summit, and is that number possible?
- Where could PASS find bridge funding? Government grants, a credit line, advances on sponsor fees, a cash infusion from Microsoft, selling seats on the Board to a very large company or two, selling off intellectual property (the mailing list, SQLSaturday & groups, maybe the store of recorded content). Note that I'm not saying I like any of those options, and there may be others, but the question is one that needs to be asked.
- What can be done to start marketing the 2021 Summit now? Can we make the decision to go virtual again right now and move on that?
- What can be done to increase the perceived value of PASS Pro and/or the subscription rate? Should work on that project continue?
- Is bankruptcy an option that needs to be explored? How much will it cost to retain counsel to get us through that?

I'm hoping we'll get clear and candid messaging from the Board before the end of December on the financial state and go-forward plans. As much as I'd like to see public discussion before that's decided, I don't think there is time – that's why I'm writing this. If you've got an idea that addresses the core problems, now is the time to share it. The post The Future of PASS appeared first on SQLServerCentral.

Notes on the PASS 2020 Virtual Summit – Conclusions from Blog Posts - SQLServerCentral

Anonymous
21 Nov 2020
2 min read
I waited a week to write this, letting the experience settle some. Looking back, it wasn't a terrible experience. Content was findable and, as far as I could tell, delivered without many issues, which is the main goal of any educational event. Networking felt like an afterthought, poorly executed – not at all a first-class experience (perhaps a better word is opportunity, since it's up to each attendee how much or little effort they put into networking). The sponsor expo made product information accessible, but it could have been so much more, and I'd be surprised if the sponsors end up feeling like it was a success for them. For an event put together on the fly due to Covid and facing comparisons to a long-established physical event, it was… ok. Certainly not a fail. Steve Jones grades the event as a "C"; it's worth reading his analysis.

I would guess that most of the people who paid the $699 for the Summit or the $999 for the week see the lower cost as a fair offset for stuff that didn't translate well to a virtual event. I'd put myself in that group for this year. If they held the Summit in exactly the same way next year, would I attend again? I think I would, because I value the content and the week of largely focusing on wide-ranging learning and a somewhat curated experience. Yet I think I'd do so a bit grudgingly – it's what I didn't get that bothers me: hallway conversations, chats over coffee, dinner with friends I see once a year, the sense of taking a break from work and immersing in career, even time walking around just looking at what each sponsor was focusing on for the year.

Tough year, tough challenges, but the event did happen and that's a good thing. I think both the event and the marketing have to be better in 2021. Lots of challenges there too, not least of which is figuring out if you can improve the way you use the platform (or get it improved) or take the bold leap of trying a different platform with all the risks and pain that comes with that. The post Notes on the PASS 2020 Virtual Summit – Conclusions appeared first on SQLServerCentral.

Kubernetes Precon at DPS from Blog Posts - SQLServerCentral

Anonymous
21 Nov 2020
2 min read
Pre-conference Workshop at Data Platform Virtual Summit 2020

I'm proud to announce that I will be presenting a pre-conference workshop at Data Platform Virtual Summit 2020, split into two four-hour sessions on 30 November and 1 December! This one won't let you down! Here are the start and stop times in various time zones:

Time Zone | Start        | Stop
EST       | 5.00 PM      | 9.00 PM
CET       | 11.00 PM     | 3.00 AM (+1)
IST       | 3.30 AM (+1) | 7.30 AM (+1)
AEDT      | 9.00 AM (+1) | 1.00 PM (+1)

The workshop is "Kubernetes Zero to Hero – Installation, Configuration, and Application Deployment".

Abstract: Modern application deployment needs to be fast and consistent to keep up with business objectives, and Kubernetes is quickly becoming the standard for deploying container-based applications fast. In this day-long session, we will start with container fundamentals and then get into Kubernetes with an architectural overview of how it manages application state. Then you will learn how to build a cluster. With our cluster up and running, you will learn how to interact with the cluster and perform common administrative tasks, then wrap up with how to deploy applications and SQL Server. At the end of the session, you will know how to set up a Kubernetes cluster, manage a cluster, deploy applications and databases, and keep everything up and running.

PS: This class will be recorded, and registered attendees will get 12 months of streaming access to the recorded class. The recordings will be available within 30 days of class completion.

Workshop Objectives
- Introduce Kubernetes Cluster Components
- Introduce Kubernetes API Objects and Controllers
- Installing Kubernetes
- Interacting with your cluster
- Storing persistent data in Kubernetes
- Deploying Applications in Kubernetes
- Deploying SQL Server in Kubernetes
- High Availability scenarios in Kubernetes

Click here to register now! The post Kubernetes Precon at DPS appeared first on Centino Systems Blog. The post Kubernetes Precon at DPS appeared first on SQLServerCentral.