
Tech News - Databases

233 Articles

Top 3 Upgrade & Migration Mistakes from Blog Posts - SQLServerCentral

Anonymous
22 Dec 2020
1 min read
This is a guest post by another friend of Dallas DBAs – Brendan Mason (L|T) Upgrading and migrating databases can be a daunting exercise if it’s not something you practice regularly. As a result, upgrades routinely get dismissed as “not necessary right now” or otherwise put off until there is not really another option. Maybe… The post Top 3 Upgrade & Migration Mistakes appeared first on DallasDBAs.com. The post Top 3 Upgrade & Migration Mistakes appeared first on SQLServerCentral.


End of an Era – SQL PASS and Lessons learned from Blog Posts - SQLServerCentral

Anonymous
22 Dec 2020
7 min read
Most of my blog is filled with posts related to PASS in some way: events, various volunteering opportunities, keynote blogging, this or that. With the demise of the organization, I wanted to write one final post but wondered what it could be. I could write about what I think caused it to go down, but that horse has been flogged to death and continues to be. I could write about my opinion on how the last stages were handled, but that again is similar. I finally decided I would write about the lessons I've learned in my 22-year association with them. This is necessary for me to move on, and may be worth reading for those who think similarly.

There is the common line that PASS is not the #sqlfamily, and that line is currently true. But back in those days, it was. At least it was our introduction to the community commonly known as #sqlfamily. So many lessons here are in fact lessons in dealing with and living with community issues.

Lesson #1: Networking is important. It seems odd and obvious to say, but it needs to be said. When I was new to PASS I stuck to tech sessions and headed right back to my room when I was done. I was, and I am, every bit the introverted geek who liked her company better than anyone else's, and kept to it. That didn't get me very far. I used to frequent the Barnes and Noble behind the Washington convention center in the evenings, to get the 'people buzz' out of me – it was here that I met Andy Warren, one of my earliest mentors in the community. Andy explained to me the gains of networking and also introduced a new term to me: 'functional extrovert'. That is, grow an aspect of my personality that may not be natural but is needed for functional reasons. I worked harder on networking after that, learned to introduce myself to new people, and hung out at as many parties and gatherings as I could. It paid off a lot more than tech learning did.

Lesson #2: Stay out of crowds and away from people you don't belong with. This comes closely on the lines of #1 and may even be a bit of a paradox. But this is true, especially for minorities and sensitive people. There are people we belong with and people we don't. Networking and attempting to be an extrovert does not mean you sell your self-respect and try to fit in everywhere. If people pointedly exclude you from conversations, or are disrespectful or standoffish – you don't belong there. Generally immigrants have to try harder than others to explain themselves and fit in – so this is something that needs to be said for us. Give it a shot, and if your gut tells you you don't belong, leave.

Lesson #3: You will be judged and labelled, no matter what. I was one of those people who wanted to stay out of any kind of labelling – to just be thought of as a good person who was fair and helpful. But it wasn't as easy as I thought. Over time, factions and groups started to develop in the community. Part of it was fed by politics created by decisions PASS made – quite a lot of it was personal rivalry and jealousy between highly successful people. I formed some opinions based on the information I had (which I would learn later was incomplete and inaccurate), but my opinions cost me some relationships and earned me some labelling. Although this happened about a decade ago, the labels and sourness in some of those relationships persist. Minorities get judged and labelled a lot quicker than others in general, and I was no exception. Looking back, I realize that it is not possible to be a friend to everyone, no matter how hard we try. Whatever has happened has happened; we have to learn to move on.

Lesson #4: Few people have the full story – so try to hold your opinions when there is a controversy. There are backdoor conversations everywhere – but this community has a very high volume of that going on. Very few people have the complete story in the face of a controversy. But we are all human; when everyone is sharing opinions we feel pushed to share ours too. A lot of times these can be costly in terms of relationships. I have been shocked, many times, at how poorly informed I was when I formed my opinion and later learned the truth of the whole story. I think some of this was fuelled by the highly NDA-ridden PASS culture, but I don't think PASS going away is going to change it. Cliques and backdoor conversations are going to continue to exist. It is best for us to avoid sharing any opinions unless we are completely sure we know the entire story behind anything.

Lesson #5: Volunteering comes with power struggles. I was among the naive who always thought of every fellow volunteer as just a volunteer. It is not that simple. There are hierarchies and people wanting to control each other everywhere. There are many people willing to do the grunt work and expect nothing more, but many others who want to constantly be right, push others around, and have it their way. Recognizing that such people exist and, if possible, staying out of their way is a good idea. Some people also function better if given high-level roles rather than grunt work – so recognizing a person's skills while assigning volunteer tasks is also a good idea.

Lesson #6: Pay attention to burnout. There is a line of thought that volunteers have no right to expect anything, including thanks or gratitude. As someone who did this a long time and burned out seriously, I disagree. I am not advocating selfishness or manipulative ways of volunteering, but it is important to pay attention to what we are getting out of what we are doing. Feeling thankless and going on for a long time with an empty, meaningless feeling in our hearts can add up to health issues, physical and mental. I believe PASS did not do enough to thank volunteers, and I have spoken up many times in this regard. I personally am not a victim of that, especially after the PASSion award. But I have felt that way before it, and I know a lot of people felt that way too. Avoid getting too deep into a potential burnout; it is hard to get out of. And express gratitude and thanks wherever and whenever possible to fellow volunteers. They deserve it and need it.

Lesson #7: There is more to it than speaking and organizing events. These are the two best-known avenues for volunteering, but there are many more. Blogging on other people's events, doing podcasts, promoting diversity, contributing to open source efforts like DataSaturdays.com – all of these are volunteering efforts. Make a list and contribute wherever and whenever possible. PASS gave people like me who are not big-name speakers many of those opportunities. With it gone it may be harder, but we have to work at it.

Lesson #8: Give it time. I think some of the misunderstandings and controversies around PASS come from younger people who didn't get the gains out of it that folks like me who are older did. Part of it has to do with how dysfunctional and political the organization as well as the community got over time – but some of it has to do with the fact that building a network and a respectable name really takes time. It takes time for people to get to know you as a person of integrity and good values, and as someone worth depending on. Give it time; don't push the river.

Last, but not least – be a person of integrity. Be someone people can depend on when they need you. Even if we are labelled or end up having wrong opinions in a controversy, our integrity can go a long way in saving our skin. Mine certainly did. Be a person of integrity, and help people. It is, quite literally, all there is. Thank you for reading, and Happy Holidays. The post End of an Era – SQL PASS and Lessons learned appeared first on SQLServerCentral.


Using Azure Durable Functions with Azure Data Factory - HTTP Long Polling from Blog Posts - SQLServerCentral

Anonymous
21 Dec 2020
5 min read
(2020-Dec-21) While working with Azure Functions, which provide a serverless environment to run my code, I'm still struggling to understand how it actually works. Yes, I admit, there is no bravado in my conversation about Function Apps; I really don't understand what happens behind the scenes when a front-end application submits a request to execute my function code in a cloud environment, and how this request is processed via the durable function framework (starter => orchestrator => activity).

Azure Data Factory provides an interface to execute your Azure Function, and if you wish, the output of your function code can be further processed in your Data Factory workflow. The more I work with this pair, the more I trust how a function app can work differently under the various Azure Service Plans available to me. The more parallel Azure Function requests I submit from my Data Factory, the more trust I put into my Azure Function App that it will properly and gracefully scale out from "Always Ready instances" to "Pre-warmed instances" to the "Maximum instances" available for my Function App. The supported runtime version for PowerShell durable functions, along with the data exchange possibilities between the orchestrator function and the activity function, requires a lot of trust too, because the latter is still not well documented. My current journey of using Azure Functions in Data Factory has been marked with two milestones so far:
  • Initial overview of what is possible - http://datanrg.blogspot.com/2020/04/using-azure-functions-in-azure-data.html
  • Further advancement to enable long-running function processes and keep Data Factory from failing - http://datanrg.blogspot.com/2020/10/using-durable-functions-in-azure-data.html

Photo by Jesse Dodds on Unsplash

Recently I realized that the initially proposed HTTP polling of a long-running function process in a data factory can be simplified even further. An early version (please check the 2nd blog post listed above) suggested that I execute a durable function orchestrator, which eventually executes a function activity. Then I would check the status of my function app execution by polling the statusQueryGetUri URI from my data factory pipeline; if its status was not Completed, I would poll it again.

In reality, the combination of an Until loop container along with Wait and Web call activities can be replaced by a single Web call activity. The reason is simple: when you initially execute your durable Azure Function (even if it will take minutes, hours, or days to finish), it will almost instantly respond with HTTP status code 202 (Accepted). The Azure Data Factory Web activity will then poll the statusQueryGetUri URI of your Azure Function on its own until the HTTP status code becomes 200 (OK). The Web activity will run this step as long as necessary, or until the Azure Function timeout is reached; this can vary for different pricing tiers - https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#timeout

The structure of the statusQueryGetUri URI is simple: it has a reference to your Azure Function App along with the execution instance GUID. How Azure Data Factory polls this URI is unknown to me; it's all about trust (please see the beginning of this blog post): https://<your-function-app>.azurewebsites.net/runtime/webhooks/durabletask/instances/<GUID>?taskHub=DurableFunctionsHub&connection=Storage&code=<code-value>

This has been an introduction; now the real blog post begins. Naturally, you can execute multiple instances of your Azure Function at the same time (event-driven processes or front-end parallel execution steps) and the Azure Function App will handle them. My recent work project requirement indicated that when parallel execution happens, a certain operation still needed to be throttled and artificially sequenced. Again, it was a special use case, and it may not happen in your projects. I tried to put such throttling logic inside my durable Azure Function activity; however, with many concurrent requests to execute this one particular operation, my function app used all of the available instances, and while those instances were active and running, my function became unavailable to the existing data factory workflows. There is a good wiki page about Writing Tasks Orchestrators that states, "Code should be non-blocking i.e. no thread sleep or Task.WaitXXX() methods." So that was my aha moment: remove the throttling logic from my Azure Function activity and move it to the data factory.

Now, when an instance of my Azure Function finds that it can't proceed further due to another operation running, it completes with HTTP status code 200 (OK), releases the Azure Function instance, and also provides an additional execution output status indicating that it's not really "OK" and needs to be re-executed. The Until loop container now handles two scenarios:
  • HTTP status code 200 (OK) and custom output Status "OK": it exits the loop container and proceeds further with the "Get Function App Output" activity.
  • HTTP status code 200 (OK) and custom output Status that is not "OK" (you can provide more descriptive info about what your not-OK scenario might be): execution continues with another round of "Call Durable Azure Function" & "Get Current Function Status".

This new approach for gracefully handling conflicts in functions required some changes: (1) in the Azure Function activity, to run the regular operation and complete with the custom "OK" status, or to identify another running instance, complete the current function instance, and provide a "Conflict" custom status; (2) in Data Factory, adjustments to check the custom Status output and decide what to do next. The Azure Function HTTP long-polling mission was accomplished; however, it now has two layers of HTTP polling: the natural webhook status collection, and the data factory custom logic that checks whether the "OK" status my webhook received was really OK. The post Using Azure Durable Functions with Azure Data Factory - HTTP Long Polling appeared first on SQLServerCentral.
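The 202-then-poll pattern the post relies on can be sketched outside of Data Factory as well. This is a minimal illustration, not the Durable Functions client API: get_status stands in for an HTTP GET of the statusQueryGetUri, and all names here are hypothetical.

```python
import time

def poll_until_complete(get_status, interval_s=5, timeout_s=600):
    """Poll a durable-function status endpoint until it reports completion.

    get_status() returns (http_code, runtime_status); in a real pipeline it
    would GET the statusQueryGetUri returned alongside the initial 202.
    """
    waited = 0
    while waited <= timeout_s:
        code, status = get_status()
        if code == 200 and status == "Completed":
            return status
        time.sleep(interval_s)       # back off before the next poll
        waited += interval_s
    raise TimeoutError("durable function did not complete within timeout")
```

This mirrors what the ADF Web activity does on its own: keep polling while the endpoint reports the run is still in progress, and stop on 200/Completed or on timeout.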


Daily Coping 21 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
21 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to see how many different people you can smile at today. I often wave and smile at people when I drive around my neighborhood. I smile and wave at people in town, especially kids. I don't seem to meet a lot of people these days, and with masks, I can't always let someone know I've smiled or see if they smile back. I can, however, count my smiles. On a recent day I had to go to the doctor and the grocery store before heading home. I smiled at these people, some of whom responded, so I'm guessing they could tell I was smiling under my mask:
  • the nurse admitting me into the facility
  • the doctor examining me
  • a 2nd nurse taking my blood
  • the receptionist returning paperwork
  • an employee walking into the store
  • the cashier
  • 3 family members
Not a huge total, but I managed to get 9 people today. The post Daily Coping 21 Dec 2020 appeared first on SQLServerCentral.


Estimating the storage savings by removing columns with NULL value across the table or marking them as SPARSE from Blog Posts - SQLServerCentral

Anonymous
21 Dec 2020
3 min read
In the previous article, Find columns with NULL values across the table, we discussed that storage space can be saved by removing columns with NULL values across the table or marking them as SPARSE. We also learnt about the query to find all such columns across the tables of a database. In this article we'll learn to estimate the storage savings from taking the necessary action on the columns with NULL values across the table, either by removing them or by marking them as SPARSE. It is extremely important to be ready with the relevant data and stats when we propose anything. Similarly, when we have to approach senior leadership for approval to take any such actions on the production database, we need to have the data supporting our claim of storage savings. I found this query very useful; it gave me the table-wise data, which we finally aggregated for the total storage savings. The query provides the following columns as output:
  • TableName: the name of the table
  • TotalColumns: the count of columns in the table with NULL values across all rows
  • TotalRows: the count of rows in the table
  • Estimated_Savings_Bytes: the estimated storage savings in bytes
Note: You may find a table tables_with_null_values_across referred to in the query. This is the same table that was created in the previous article; this article is a continuation of Find columns with NULL values across the table.
SELECT DV.TableName
     , COUNT(DISTINCT DV.ColumnName) AS TotalColumns
     , DV.TotalRows
     , SUM(DV.TotalRows *
           CASE WHEN COL.DATA_TYPE IN ('CHAR', 'NCHAR')       THEN COL.CHARACTER_OCTET_LENGTH
                WHEN COL.DATA_TYPE = 'TINYINT'                THEN 1
                WHEN COL.DATA_TYPE = 'SMALLINT'               THEN 2
                WHEN COL.DATA_TYPE = 'INT'                    THEN 4
                WHEN COL.DATA_TYPE = 'BIGINT'                 THEN 8
                WHEN COL.DATA_TYPE IN ('NUMERIC', 'DECIMAL')  THEN 9
                WHEN COL.DATA_TYPE = 'FLOAT'                  THEN 8
                WHEN COL.DATA_TYPE = 'DATE'                   THEN 3
                WHEN COL.DATA_TYPE = 'TIME'                   THEN 5
                WHEN COL.DATA_TYPE = 'SMALLDATETIME'          THEN 4
                WHEN COL.DATA_TYPE = 'DATETIME'               THEN 8
                WHEN COL.DATA_TYPE = 'BIT'                    THEN 1
                ELSE 2
           END) AS Estimated_Savings_Bytes
FROM tables_with_null_values_across DV WITH (NOLOCK)
INNER JOIN INFORMATION_SCHEMA.COLUMNS COL WITH (NOLOCK)
        ON COL.TABLE_NAME = PARSENAME(DV.TableName, 1)
       AND COL.COLUMN_NAME = PARSENAME(DV.ColumnName, 1)
GROUP BY DV.TableName
       , DV.TotalRows

The post Estimating the storage savings by removing columns with NULL value across the table or marking them as SPARSE appeared first on SQLServerCentral.
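The per-type byte sizes in the query's CASE expression can be applied outside SQL Server too, for example as a quick sanity check of the numbers the query returns. Here is a minimal sketch in Python mirroring the query's size table (including the ELSE fallback of 2 bytes); the function name and input shape are my own, and this is an approximation, not an exact storage calculation:

```python
# Fixed storage sizes in bytes per SQL Server data type,
# matching the CASE expression in the query above.
TYPE_SIZE_BYTES = {
    "TINYINT": 1, "SMALLINT": 2, "INT": 4, "BIGINT": 8,
    "NUMERIC": 9, "DECIMAL": 9, "FLOAT": 8,
    "DATE": 3, "TIME": 5, "SMALLDATETIME": 4, "DATETIME": 8, "BIT": 1,
}

def estimated_savings_bytes(null_columns, total_rows):
    """null_columns: list of (data_type, octet_length) for all-NULL columns."""
    total = 0
    for data_type, octet_length in null_columns:
        if data_type in ("CHAR", "NCHAR"):
            size = octet_length                       # fixed-length strings use their declared size
        else:
            size = TYPE_SIZE_BYTES.get(data_type, 2)  # the query's ELSE branch
        total += total_rows * size
    return total
```

For example, a table with 100 rows, an all-NULL INT column, and an all-NULL CHAR(10) column yields 100 * 4 + 100 * 10 = 1400 bytes of estimated savings.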


The 2021 Plan for SQLOrlando from Blog Posts - SQLServerCentral

Anonymous
20 Dec 2020
2 min read
SQLOrlando Annual Plan 2021 Final (Download)

Over the past couple of years we've been slowly evolving from a fairly ad hoc plan of doing what we did last year, to a semi-structured plan that was mainly bullet points, to a more structured and written-out plan for 2021. Writing out a formal(ish) plan supports these goals:
  • Explain clearly to the Board (of SQLOrlando) what we intend to do (ultimately they can accept or amend the plan – it's been approved for 2021)
  • Explain clearly to our community what we intend to do – this document is already public (but not announced)
  • It's our authorization to spend
  • It's a way to work on continuity. If one of us gets tired, distracted, whatever, we have a map of where to go (and an operating manual and Trello board to support it)
  • Not least, it makes us think about what we want to do
I like having a plan. It's certainly a less-than-perfect plan and I'm sure things will change (some already have, with the end of PASS). It was written without nearly enough community input, something I hope we can improve on next year. I like transparency, and for me this is walking the walk. No reason for secrets here. It's an ambitious plan for sure, and my goal isn't to say that you need to do what we're doing in terms of the number or types of events. Borrow ideas if you like them, absolutely, but do the things that excite you, your volunteers, your community. It's easy (well, sorta) to do more events, but it's a lot harder to get more people to attend a single event, something we haven't figured out in Orlando. If you write a plan for your group, public or not, I'd love to see it. More ideas, different implementations; I'll take those wherever I can find them. The post The 2021 Plan for SQLOrlando appeared first on SQLServerCentral.

Some Post-PASS Thoughts on Local User Groups from Blog Posts - SQLServerCentral

Anonymous
20 Dec 2020
4 min read
For all the groups, the most immediate need is to rescue what they can of the mailing list stored at PASS.org and to have a new landing page and/or meeting registration site. The challenge on the mailing list is that you can't export it (privacy!), so the best you can do is email (multiple times) to give people the new home of your group. We've been using Meetup for all our free events and it's been decent; its biggest strength is that people find us far more often than they did by joining PASS (they had to know to join). The downside for a group of more than 50 members is that you have to pay. In Orlando that has been $180 a year, but it looks like that might be increasing. It's far less capable than Eventbrite for paid events. For us it's been worth it, but we're in the same situation as with PASS: if something happens to Meetup, we lose the list. That means either finding something we like better or building out an alternate list (LinkedIn groups are not bad for this), because I really don't like a single point of failure.

Whether you need anything more than Meetup (or whatever equivalent you pick) is really up to you. In Orlando we run sqlorlando.org on WordPress (the hosted version) for about $45 a year (so that we can use our domain). It's not much to look at so far, but we've budgeted some time and money to work on that this year. It's important as a non-profit to have some place to post by-laws and minutes (Meetup doesn't let you upload files), and I see some value, whenever I meet someone that is interested, in being able to say "just go to sqlorlando.org and you'll find all the things". It's one more thing to pay for and maintain, so it's definitely optional depending on your goals. To say that differently, think about what will work for your group. You haven't got a lot of time, but you have some time to decide.

Last night I removed the reference to PASS from our site, added a task to remove it from our operating manual when we do the next update, and removed the link/suggestion to join from the announcement email we send when someone joins Meetup. Today I'm going to update our Trello template cards to remove the tasks for emailing the group monthly, posting new events, and closing out completed events. It's still important to track what we've done, so I added a couple more columns to our planning sheet for now. To a degree this simplifies the workflow for us.

Not to minimize the jump-through-hoops exercise at the end of the year, but once you figure out where to send people and try to get some of the list to move, you're back to where you were: running your group. Whether it feels that way probably depends on your group identity. Orlando back in 2004 was just a PASS chapter; we did it the way they suggested (more or less) and that was ok. In 2020 we're a non-profit that chose to align with PASS because it aligned with our goals for serving the Orlando tech community. At the monthly-meeting level, losing PASS just doesn't change what we do at all (clearly the bigger and more painful impact is losing SQLSaturday.com, and I'll write more on that in the next week or two). None of that removes the emotional impact. We're still a network, just more loosely coupled than we were – for now.

December is a good time for reflecting and planning, so maybe spend an hour or two thinking about how your group can serve your local community next year, and write it down given all the changes. In my next post I'll share what we had planned for next year in Orlando and how we'll be amending that plan. The post Some Post-PASS Thoughts on Local User Groups appeared first on SQLServerCentral.


TF-IDF in .NET for Apache Spark Using Spark ML v2 from Blog Posts - SQLServerCentral

Anonymous
19 Dec 2020
1 min read
Spark ML in .NET for Apache Spark Apache Spark has had a machine learning API for quite some time and this has been partially implemented in .NET for Apache Spark. In this post we will look at how we can use the Apache Spark ML API from .NET. This is the second version of this post, the first version was written before version 1 of .NET for Apache Spark and there was a vital piece of the implementation missing which meant although we could build the model in . The post TF-IDF in .NET for Apache Spark Using Spark ML v2 appeared first on SQLServerCentral.
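The post's .NET pipeline isn't reproduced here, but the underlying TF-IDF idea is easy to sketch. Below is a minimal plain-Python version for tokenized documents; note that Spark ML's IDF actually uses a smoothed formula, log((N + 1) / (df + 1)), rather than the textbook log(N / df) shown here, so treat this as a conceptual sketch rather than a reimplementation of the library:

```python
import math
from collections import Counter

def tf_idf(docs):
    """docs: list of tokenized documents. Returns per-document term scores."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)               # raw term counts in this document
        scores.append({term: (count / len(doc)) * math.log(n / df[term])
                       for term, count in tf.items()})
    return scores
```

A term that appears in every document (df equal to N) scores zero, while rarer terms are weighted up; that is the whole point of the IDF factor.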


ETL Antipattern: Load Processes that Don’t Scale from Blog Posts - SQLServerCentral

Anonymous
19 Dec 2020
1 min read
One of the most significant design considerations in ETL process development is the volume of data to be processed. Most ETL processes have time constraints that require them to complete their load operations within a given window, and the time required to process data will often dictate the design of the load. One of the more common mistakes I’ve seen... The post ETL Antipattern: Load Processes that Don’t Scale appeared first on Tim Mitchell. The post ETL Antipattern: Load Processes that Don’t Scale appeared first on SQLServerCentral.
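A common remedy for load processes that don't scale is to process the source in bounded batches instead of one monolithic load, so memory use and lock time stay flat as volume grows. Here is a minimal, technology-agnostic sketch of that pattern; extract_batch and load_batch are hypothetical hooks standing in for the source query and target write, not part of any named tool:

```python
def load_in_batches(extract_batch, load_batch, batch_size=10_000):
    """Run an ETL load in fixed-size batches.

    extract_batch(offset, limit) returns the next slice of source rows
    (empty when exhausted); load_batch(rows) writes one slice to the target.
    Returns the total number of rows loaded.
    """
    offset, total = 0, 0
    while True:
        rows = extract_batch(offset, batch_size)
        if not rows:                 # source exhausted
            break
        load_batch(rows)             # commit one bounded unit of work
        total += len(rows)
        offset += batch_size
    return total
```

The same shape works whether the hooks wrap a keyset-paginated SQL query, a file reader, or an API client; the design point is that each iteration's working set is capped at batch_size regardless of total volume.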


Placing my Bid for SQLSaturday.com from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
1 min read
I'm making my bid. I don't know whether this brand needs to continue, or that it will, but I'd like to hold this for posterity, at least on an interim basis. Feel free to submit your own bid. I am hoping PASS is willing to return this brand to the founders. The post Placing my Bid for SQLSaturday.com appeared first on SQLServerCentral.

Speaker Guidance: Save Your Data from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
3 min read
The PASS organization (and likely SQL Saturday) is shutting down its sites on 15 Jan 2021. This means that potentially lots of links and data will disappear. There are people in the community saving some of the data and images. I'm hoping to put some of those in an archive that will live on and remain accessible, but for speakers, the links to events are part of our living CV. It may be important for you to capture some of this. I've got a few things for you to do, especially those of you that are Microsoft (or other) MVPs, or nominated. I have thoughts here on image schedules and your living CV. Note that you might not care about this, or you might not be sure. However, once the data is gone, it's gone, so you might want to get it now and decide on its value later. Sections here:
  • Image Schedules
  • A Speaking CV
  • Preserving SQL Sat Data

Image Schedules
I should have done this years ago, as some events (like DevConnections) get rid of their site for particular events and only show the schedule for the next one. As a matter of fact, they're done. For SQL Saturdays, you can get the schedule for an event on their page. For example, I spoke at SQL Sat 997 – Salt Lake City. I can get the schedule from an image capture, or at the bottom there is an export to PDF. I recommend you save these, then upload the image/PDF to someplace you can link to from your CV. Going forward, I'd continue to do this, and even take pictures of the marquee when we get back to live events.

Speaking CV
I track lots of activity for my Microsoft MVP award, but I also keep a Speaking CV I can point to from my resume. For me, this is a static page on my WordPress blog that I update; I separate things by year and include the events and links. Typically I link to a schedule page for an event, but these will all be broken, or some of them will. One of my tasks this winter will be to go capture some of the visual evidence of these, if I can. That way I can preserve things. One other note: the first day back at work after I speak, I usually update this and ensure it's current.

Preserving SQL Sat Data
For posterity, or maybe just for me, I was curious what activity I'd had. If I log into the PASS site, I can go to My SQL Saturday and see the events where I submitted things. If you're curious, go here, and then you know where to save an image. The footer gets in the way, but you can copy/paste and re-sort this all if you need to. The post Speaker Guidance: Save Your Data appeared first on SQLServerCentral.


Daily Coping 18 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to buy an extra item and donate it to a local food bank. The food bank has always been important to me. I used to volunteer with my kids, helping them understand how lucky they are and how others may not be. Every year I've tried to donate some supplies to one of those close to me, though probably not often enough. This year, with the strange world we live in and fewer trips, I know there are plenty of people in need. I decided to do some extra donations to the local bank. Rather than give them some food items, I purchased some local gift cards and gave them those. The food bank uses these for perishable items, like milk, eggs, etc. I also set a reminder to do this regularly. The post Daily Coping 18 Dec 2020 appeared first on SQLServerCentral.

SQLSaturday is dead, long live DataSaturdays from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
2 min read
This is a very brief post to inform you that PASS has died, killed by the for-profit company behind it. That's sad beyond words, but we, as a community, are not destined to the same fate. The community will rise again and build something new. One of the things that we don't want to lose is SQLSaturday. It's been a substantial vehicle for involving community members locally and globally. It has been the launchpad for many community speakers. It has been the opportunity for many people to connect with other community members, share their knowledge and learn something new. Connect, share, learn… that sounds familiar, right? We don't want to take the existing SQL Saturday and give it a new name; we want to start a new community initiative that enables us to continue delivering events. It needs to be a platform that allows us to continue doing what we were doing. Do you want to be involved? Here's what you can do: Head to datasaturdays.com and have a look. There's not much content right now, but you have to start from something… Go to GitHub and join the discussion. There are many aspects that we need to cover and we know we're not perfect right now. Please bear with us, we want to improve. The main message here is that we need your help to continue running events for people to share, network and learn. A name is just a name, and there's more than that identifying our community. Come and help us; be a part of the solution. The post SQLSaturday is dead, long live DataSaturdays appeared first on SQLServerCentral.
Workout Wednesdays for Power BI in 2021 from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
3 min read
I’m excited to announce that something new is coming to the Power BI community in 2021: Workout Wednesday! Workout Wednesday started in the Tableau community and is expanding to Power BI in the coming year. Workout Wednesdays present challenges to recreate a data-driven visualization as closely as possible. They are designed to help you improve your skills in Power BI and Tableau. How You Can Participate Watch for the Power BI challenge to be published on Wednesdays in 2021. The challenge will contain requirements and a dataset. Use the dataset to create the desired end result. Then share your workout! You can post your workout to the Data Stories Gallery or your blog, or just share a public link. If you aren’t able to share a public link – perhaps because that option is disabled in your Power BI tenant or you don’t have a Power BI tenant – a gif, a video, or even some screenshots are just fine. To formally participate: Post to Twitter using both the #WOW2021 and #PowerBI hashtags along with a link/image/video of your workout. Include a link to the challenge on the Workout Wednesday site. And please note the week number in your description, if possible. Community Growth I’m looking forward to Workout Wednesdays for a couple of reasons. First, I think Power BI needs more love in the data visualization department. We need to be talking about effective visualization techniques and mature past ugly pie charts and tacky backgrounds. I think Workout Wednesdays will help us individually grow those skills, but it will also foster more communication and sharing of ideas around data visualization in Power BI. That in turn will lead to more product enhancement ideas and conversations with the Power BI team, resulting in a better product and a stronger community. Second, I’m also excited to see the cross-pollination and cross-platform learning we will achieve by coming together as a data visualization community that isn’t focused on one single tool.
There is a lot Tableau practitioners and Power BI practitioners can learn from each other. Join Me In January Keep an eye out on Twitter and the Workout Wednesday website for the first challenge coming January 6. While it would be great if you did the workout for every single week, don’t be concerned if you can’t participate every week. A solution will be posted about a week later, but nothing says you can’t go back and do workouts from previous weeks as your schedule allows. I look forward to seeing all of your lovely Workout Wednesday solutions next year! The post Workout Wednesdays for Power BI in 2021 appeared first on SQLServerCentral.

ASF 035: Alex Yates interview (part 2) from Blog Posts - SQLServerCentral

Anonymous
18 Dec 2020
1 min read
Alex is a Data Platform MVP who loves DevOps. He has been helping data professionals apply DevOps principles to relational database development and deployment since 2010. He’s most proud of helping Skyscanner develop the ability to deploy 95 times a day. Alex has worked with clients on every continent except Antarctica – so he’s keen to meet anyone who researches penguins. Source The post ASF 035: Alex Yates interview (part 2) appeared first on SQLServerCentral.