Tech News - Databases


Closure with the Professional Organization for SQL Server from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
5 min read
I know that this isn't the correct name, though the by-laws still list this as the corporation. Perhaps this is one more sign of the failure to evolve and grow that I've felt from the organization.

Three directors of the PASS Board of Directors resigned this week. Mindy Curnutt, longtime member, volunteer, and advocate for the community, was first, with Melody Zacharias and Hamish Watson following suit. You can read their open letters (Mindy, Melody, Hamish), which appear to hint at some sort of disagreement, argument, ethical or moral failure, or maybe just anger. I have no idea what happened. I'm curious as to what it could be, but I also think this might be something better left to fade away. I've had my disagreements with the organization in the past, and I certainly think the culture and governance of the executive board is broken.

In the aftermath of outcries from various prominent members of the community, Grant Fritchey left a note that there are legal issues as to what is happening now. I do not know whether these are financial/debt issues or something else, and I am not speculating on what they are. I do appreciate Grant's engagement with the community; in my memory, since Kevin Kline, he has been one of the very, very few to actually engage with the community on controversial issues. I haven't always agreed with him, but I respected and appreciated the effort.

However, most directors that have served on the executive committee, which includes the Executive Director from C&C, release very little information. Updates take place relatively rarely, and little is proposed or discussed with members publicly. There is no law or legal liability that would prevent an announcement acknowledging the resignation of directors, or a news release that thanks Mindy, Melody, and Hamish for their service. There is no penalty for noting they have resigned. Here, I'll make it easy for you on Twitter:

@SQLPASS: Today Mindy Curnutt resigned from the PASS Board of Directors. We thank Mindy for her many years of service and wish her well in future endeavors.

@SQLPASS: Today Melody Zacharias resigned from the PASS Board of Directors. We thank Melody for her many years of service and wish her well in future endeavors.

@SQLPASS: Today Hamish Watson resigned from the PASS Board of Directors. We thank Hamish for his many years of service and wish him well in future endeavors.

No copyright here; feel free to cut and paste. It would be even easier to drop these three notes on the https://www.pass.org/About-PASS/PASS-News page, because, well, this is news. Instead, we have the same lack of engagement, trust, respect, and leadership that has permeated the culture for well over a decade. There is no accountability to the membership, or to the board of directors, that I can see. Whether this is the appointed members (President, VP-Finance, VP-Marketing) and/or the executive director, they operate independently of the community and the board. This has been my primary complaint, and I suspect it is also the complaint of many in the SQLFamily community. I see no reason for a large organization to exist primarily to run a profitable conference that pays salaries and bonuses to a management company whose staff are de facto employees, with management that doesn't seek to be a part of the community.

As I write that, I'm saddened, mostly for the employees of C&C. Over the years I have had many opportunities to work with Marcella, Craig, Anika, Leeza, Erick, Audrey, and likely others I am forgetting. They have worked hard to ensure events have run smoothly, and I've appreciated their help and assistance in various matters. They have been a part of the community, and I hope they continue to be. I'm saddened that they may find themselves cast aside if the organization fails. I do hope they receive proper notice and compensation if this is the case.

My one regret in all of this is that SQL Saturday is inextricably bound up in the legal mechanisms of PASS. Andy Warren, Brian Knight, and I gifted this to the organization, trusting they would be good stewards of the events. They have been, and I know that these events will continue, either under this moniker or another. Our community is too strong to let these lapse. We will find a way for these to continue, whether with PASS or not.

I hope Microsoft continues to support community events and organizations, but I do not hope they provide any more assistance to the PASS organization. The lack of governance and transparency, along with the poor culture of the executive committee and management company in engaging with the community, leads me to the conclusion that this is not the place to invest and engage with a community. The organization has helped a strong community grow over the years, but it has outlived its usefulness. It is time for an evolution to something new that exists to serve the community rather than the organization itself. I continue to support SQL Saturday events and chapters regardless of affiliation.

The post Closure with the Professional Organization for SQL Server appeared first on SQLServerCentral.


Find columns with NULL values across the table from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
4 min read
Recently, I was working on a performance tuning assignment with an esteemed client. The size of their databases was growing tremendously, and growth in database size is directly proportional to disk space requirements: the larger the database, the more disk space is required. When we talk about disk space for production, it is not specific to just one server. There are multiple factors that need to be considered, such as the primary server, secondary servers, backup servers, and backups (full, differential, and log backups). A long time ago, I blogged about a similar topic. Read the full article here.

While we were discussing the various possibilities for reducing the database size, the client brought up a very interesting point. He said, "My database has many huge tables (with sizes beyond 500 GB). Most of them have multiple columns which have null values across the table." Although the client was aware that there were many such tables and columns, there was no definitive list. This opened a whole new arena for me and gave me some unique insights. I started thinking about how to build some logic to fetch the list of all such columns and tables. You may be amazed to hear it, but it's true: there were more than 4K such columns belonging to 500+ tables. We estimated that we could save terabytes of disk space by marking all such columns as SPARSE. More on SPARSE columns can be read in the Microsoft official documentation.

In this article we'll see how to get the list of columns which have null values across all the rows of a table.

Note: Running the query for all the tables together would take a lot of time. It is recommended to run the query for sets of tables in batches, in parallel, in multiple instances of SSMS or with the help of jobs.

In the next article, we'll talk about how to estimate the storage savings from removing such columns or marking them as SPARSE.

IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = 'tables_with_null_values_across')
BEGIN
    CREATE TABLE tables_with_null_values_across
    (
        TableName  VARCHAR(200)
      , TotalRows  BIGINT
      , ColumnName VARCHAR(200)
    )
END

/*
DROP TABLE IF EXISTS #nullable_columns;
DROP TABLE IF EXISTS #tables_with_nullable_columns;
*/

SET NOCOUNT ON

SELECT A.object_id
     , B.name AS TableName
     , A.name AS ColumnName
     , ROW_NUMBER() OVER(PARTITION BY A.object_id ORDER BY A.name ASC) AS RowID
INTO #nullable_columns
FROM sys.columns A
INNER JOIN sys.tables B
    ON B.object_id = A.object_id
    AND B.type = 'U'
INNER JOIN tables_with_same_values_across C
    ON C.TableName = QUOTENAME(B.name)
    AND C.ColumnName = QUOTENAME(A.Name)
LEFT JOIN tables_with_null_values_across D
    ON D.TableName = C.TableName
    AND D.ColumnName = C.ColumnName
WHERE A.is_nullable = 1
AND D.TableName IS NULL
-- AND B.name IN ('', '')
-- Note: Supply the table names in the filter clause for B.name in order to run the query in batches of tables.

SELECT DISTINCT A.object_id
     , B.name AS TableName
     , IDENTITY(INT, 1, 1) AS RowID
INTO #tables_with_nullable_columns
FROM #nullable_columns A
INNER JOIN sys.tables B
    ON B.object_id = A.object_id

DECLARE @TableName AS SYSNAME
      , @ColumnName AS SYSNAME
      , @Object_ID AS INT
      , @Table_RowID AS INT
      , @Column_RowID AS INT
      , @Column_Distinct_Values_Count AS BIGINT
      , @Total_Rows AS BIGINT

SET @Table_RowID = 1;

WHILE EXISTS (SELECT 1 FROM #tables_with_nullable_columns WHERE RowID = @Table_RowID)
BEGIN
    SELECT @Object_ID = object_id
         , @TableName = TableName
    FROM #tables_with_nullable_columns
    WHERE RowID = @Table_RowID;

    SET @Column_RowID = 1;

    WHILE EXISTS (SELECT 1 FROM #nullable_columns WHERE object_id = @Object_ID AND RowID = @Column_RowID)
    BEGIN
        SELECT @ColumnName = ColumnName
        FROM #nullable_columns
        WHERE object_id = @Object_ID
        AND RowID = @Column_RowID;

        DECLARE @SQLString NVARCHAR(500);

        SET @SQLString = N'SELECT @Column_Distinct_Values_Count = COUNT(DISTINCT ' + QUOTENAME(@ColumnName) + ') , @Total_Rows = COUNT(1) FROM ' + QUOTENAME(@TableName) + ' WITH (NOLOCK)';

        BEGIN TRY
            EXECUTE sp_executesql @SQLString
                , N'@Total_Rows BIGINT OUTPUT, @Column_Distinct_Values_Count BIGINT OUTPUT'
                , @Total_Rows = @Total_Rows OUTPUT
                , @Column_Distinct_Values_Count = @Column_Distinct_Values_Count OUTPUT;
        END TRY
        BEGIN CATCH
        END CATCH

        IF (@Column_Distinct_Values_Count = 0)
        BEGIN
            INSERT INTO tables_with_null_values_across (TableName, TotalRows, ColumnName)
            VALUES (QUOTENAME(@TableName), @Total_Rows, QUOTENAME(@ColumnName))
        END

        SET @Column_RowID = @Column_RowID + 1;
    END

    SET @Table_RowID = @Table_RowID + 1;
END

DROP TABLE IF EXISTS #nullable_columns;
DROP TABLE IF EXISTS #tables_with_nullable_columns;

The post Find columns with NULL values across the table appeared first on SQLServerCentral.
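As a follow-up illustration (not part of the original post), here is a minimal, hedged T-SQL sketch of the SPARSE change the author alludes to; the table name dbo.OrderDetails and the column LegacyComment are hypothetical placeholders:

-- Hypothetical example: convert a nullable, mostly-NULL column to SPARSE.
-- The column's data type must be restated in the ALTER COLUMN statement.
ALTER TABLE dbo.OrderDetails
    ALTER COLUMN LegacyComment VARCHAR(500) SPARSE;

-- To undo the change, run the same statement without the SPARSE keyword.

Sparse columns store NULLs at essentially no cost but add a small overhead to non-NULL values, so the change only pays off for columns that are NULL in the vast majority of rows, which is exactly what the query above identifies.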


Is On-premises SQL Server Still Relevant? from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
5 min read
Unequivocally, yes, on-premises SQL Server instances are still relevant. While I'm a firm believer that the cloud is not a fad and is not going away, it's just an extension of a tool that we are already familiar with. The Microsoft marketing slogan is "It's just SQL," and for the most part that is indeed true. However, that does not mean that every workload will benefit from being in the cloud. There are scenarios where it does not make sense to move things to the cloud, so let's take a look at a few of them.

The cloud can cost a lot

There is no such thing as a free lunch, and the cloud is not excluded. I am sure that we've all heard horror stories of individuals leaving resources active, which in turn cost large sums of money. While the cloud offers a wide range of capabilities to aid the day-to-day life of IT professionals everywhere, it might not be cost effective for your given workload or data volumes. Compute resources, and all things associated with them, cost money. If you need higher CPU, more money. If you need terabytes of storage, more money. If you need a higher CPU-to-memory ratio for that virtual machine, more money. All of the resources the cloud offers, you essentially rent, and the bigger the space, the more money it takes. Of course, all of this is dependent on your organizational requirements and associated workloads. By having an on-premises environment you can implement a lower cost of ownership for hardware. That being said, the cloud offers more efficient means of upgrading and scaling, which is usually limited in on-premises ecosystems, and this can actually save you money. It's a trade-off that organizations have to weigh to see if moving to the cloud makes sense.

You want control of all things

Most things in the cloud require that organizations relinquish control. That is just a plain fact, and that's not changing. We are trading speed and agility from an infrastructure perspective for a lower ability to control certain aspects of the architecture. For example, with Azure SQL Database (Platform as a Service), database administrators can no longer control the database backup method or frequency. In exchange for this loss of control, though, backups are taken automatically for us. In my opinion, this is a more than fair exchange, and I sleep better knowing that a tried and vetted backup process is taking care of things without my intervention.

You have specific compliance or regulation requirements

While most of the players in the public cloud space (Azure, Amazon, Google) are certified for a multitude of compliance regulations, it's possible that you have a very specific one that the provider is unable to meet. If this is the case, then your ability to move to the cloud is limited and you are forced to remain on-premises. Regulations could also pose issues when moving to the cloud. These regulations could be imposed by the governing body of the organization or be sourced from various places. If this is the case, it's possible that the cloud is not a viable solution for your organization. I do suspect that as cloud technology continues to advance, regulations and compliance requirements will slowly be brought into the fold and allow for appropriate cloud implementations.

You do not have the expertise

Put simply, you do not have the knowledge internally to successfully migrate to the cloud, nor do you have the budget to hire someone to move you to the cloud. Shameless plug: this is one of our core competencies here at Denny Cherry & Associates Consulting. We help organizations (big or small) get into the cloud to help push their data ecosystem forward. However, not every organization can afford to hire consultants (short or long term) to help them with such a project. In this instance, until you can get the expertise to help, you are left with either staying on-premises or trying to figure it out on your own. In some respects, the cloud opens new security exposures that must be accounted for when moving to it. If these are not accounted for, severe issues could arise for the organization, so I recommend not going down the "we'll figure it out as we go" path without some level of guidance.

Your workloads do not perform in the cloud

Even though I am a huge fan of Azure, some workloads just won't perform well unless you break out your wallet (see the first section). Even with proper performance tuning, the performance comparison between on-premises and the cloud is not going to be a true apples-to-apples comparison. The infrastructure is just too vastly different to really get that "exact" level of comparison. Organizations must find the sweet spot between performance and infrastructure costs, and frankly, sometimes that sweet spot dictates remaining on on-premises hardware.

Summary

There are probably many other reasons why on-premises infrastructures will continue to be relevant. Each organization may have unique requirements such that having SQL Server on their own hardware is the only solution. Remember, regardless of where you deploy SQL Server, it is just SQL and it'll behave the same (mostly). This does not mean that you should not continue to expand your skill sets. Make sure to continue to learn about cloud technologies so that when your organization is ready to make the leap, you can do so in a safe and secure manner.

© 2020, John Morehouse. All rights reserved. The post Is On-premises SQL Server Still Relevant? first appeared on John Morehouse. The post Is On-premises SQL Server Still Relevant? appeared first on SQLServerCentral.


Daily Coping 11 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.  Today’s tip is to leave a positive message for someone else to find. An unexpected note is something that can brighten someone’s day. If it’s a positive message, that is, and that’s the goal today. One thing I do is keep chocolate around my house. My wife and daughter will have a craving periodically, usually every few weeks, and we’re not very close to any stores that sell chocolate they like. The cheap stuff at the local gas station doesn’t cut it. As a result, I keep a few bars around, usually hidden in the house. My wife will ask me for some, but my daughter will just start hunting. I decided to not take a chance and just write a couple nice notes and tape them to the bars. That way the next time they find one (or I give them one), they get a surprise message. The post Daily Coping 11 Dec 2020 appeared first on SQLServerCentral.


New Microsoft data governance product: Azure Purview from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
3 min read
Azure Purview is a data governance solution that is the sequel to Azure Data Catalog, and it is now available in public preview. Purview catalogs data from on-premises, multi-cloud, or software-as-a-service (SaaS) locations. Purview lets you understand exactly what data you have, manage its compliance with privacy regulations, and derive insights. You can create a unified, up-to-date understanding of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage via the Purview Data Map. Purview aims to maximize the compliant use of your own data by understanding it, how it moves (i.e. lineage), and who it is shared with. It also integrates with Microsoft Information Protection and Power BI sensitivity labels and certification and promotion labels.

Azure Purview includes three main components:

Data discovery, classification, and mapping: It will automatically find all of an organization's data, on-premises or in the cloud, even data managed by other providers, and evaluate the characteristics and sensitivity of the data as it scans it.

Data catalog: It enables all users to search for trusted data using a simple web-based search engine. There are also visual graphs that let you quickly see whether data of interest is from a trusted source.

Data governance: It provides a bird's-eye view of your company's data landscape, enabling "data officers" to efficiently govern the use of data. This enables key insights such as the distribution of data across multiple environments, how data is moving, and where sensitive data is stored.

There is a sophisticated search engine to view all the scanned items, and it tracks data lineage. Below are the nine different sources you can currently scan (more to come soon). I have gotten all the scans to work on all of the sources except Power BI, as that requires a bit of extra work to scan a workspace different from the one in your subscription (by default, the system will use the Power BI tenant that exists in the same Azure subscription). To register a Power BI workspace outside your subscription, see "Use PowerShell to register and scan Power BI in Azure Purview (preview)". For those sources that are not supported, there is an option to submit data to the catalog via the Azure Purview REST APIs. You can also use the APIs to build your own user experience on the catalog. You can also use a "map view" to see all the sources and group them under collections. Azure Purview also comes with system-defined classification rules, but you can add your own custom classification rules as well.

To ramp up quickly, I suggest you visit the Azure Purview product page, get started with the Azure Purview documentation, and view the Mechanics video to see Azure Purview in action and give your feedback via UserVoice.

More info:
A first look at Azure Purview – Data Governance for your data estate
Azure Synapse Analytics – Introduction to Azure Purview
Microsoft introduces Azure Purview data catalog; announces GA of Synapse Analytics
Use Power BI with Azure Purview to achieve better data governance and discovery
Map your data estate with Azure Purview

The post New Microsoft data governance product: Azure Purview first appeared on James Serra's Blog. The post New Microsoft data governance product: Azure Purview appeared first on SQLServerCentral.


Daily Coping 10 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to support a charity, cause, or campaign you really care about. I did this recently, when Jeff Atwood was raising money for the Georgia Senate races in the US. I donated, entered his contest, and won an iPad. I didn't expect to win, and donated because I wanted to, but this was a bonus. That inspired me to donate as well to Kiva. I've sent them money in the past, and did so again, giving them a little, and then lending out more to help others trying to grow their own businesses. I've had good luck with repayments, which lets me lend again, but even when someone fails at their business, I'm glad I helped them make an attempt. The post Daily Coping 10 Dec 2020 appeared first on SQLServerCentral.

Azure SQL Database Connectivity from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
1 min read
Have you ever wondered how your connection from outside of Azure to your database is handled? It is important to understand that there is a difference between the route(s) taken when connecting from inside Azure and those taken when connecting from outside of Azure. When outside … Continue reading → The post Azure SQL Database Connectivity appeared first on SQLServerCentral.


T-SQL Tuesday Retrospective #007: Summertime in the SQL from Blog Posts - SQLServerCentral

Anonymous
09 Dec 2020
1 min read
This is the seventh post in my retrospective attempt to answer every T-SQL Tuesday invitation. At the beginning of June 2010, Jorge Segarra invited us to write about our favourite hot new feature in SQL Server 2008 or 2008 R2. For me, this would be the introduction of the DATE, TIME, and DATETIME2 data types. Continue reading T-SQL Tuesday Retrospective #007: Summertime in the SQL. The post T-SQL Tuesday Retrospective #007: Summertime in the SQL appeared first on Born SQL. The post T-SQL Tuesday Retrospective #007: Summertime in the SQL appeared first on SQLServerCentral.
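For readers who have not used them, here is a minimal, hedged T-SQL sketch of the three data types the post refers to; the variable names and literal values are purely illustrative and not taken from the original article:

-- DATE holds only the calendar date, TIME only the time of day,
-- and DATETIME2 combines both with up to seven digits of fractional seconds.
DECLARE @d DATE = '2010-06-08';
DECLARE @t TIME(3) = '14:30:00.123';
DECLARE @dt DATETIME2(7) = SYSDATETIME();

SELECT @d  AS JustTheDate,
       @t  AS JustTheTime,
       @dt AS DateAndTime;

Compared to the older DATETIME type, DATETIME2 offers a wider date range and finer precision, while DATE and TIME avoid storing the half of the value you do not need.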


Daily Coping 9 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
09 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to give kind comments to as many people as possible today. I try to write these a few days in advance, so I'm going to repeat a few things I posted to others today. These were random tweets, messages, texts, etc.

"That was a great presentation"
"That code looks great"
"Excellent way to express that"
"You look happy today"
"Your hair looks good"
"That message really made my day. Thanks"

The post Daily Coping 9 Dec 2020 appeared first on SQLServerCentral.


Open Letter to the PASS Membership, from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
When I was appointed to serve as a Director at Large for the PASS Organization a year ago, it was with great passion, excitement and enthusiasm. I had some great ideas about how I could help globalize, diversify and improve our wonderful community that I've been a member of since 2013. The PASS Community has provided me with friends and the opportunity to connect, share and learn. I wanted to serve our community but have sadly realised that I cannot. It is with a heavy heart that I am delivering this message today. On Monday 7th December 2020 (Pacific Time) I resigned from the PASS Board. I had considered resigning from the board back in July this year, but wanted to continue and try to serve our community by staying on the board. I wanted to bring change – I campaigned on it when running for the board this year. Ultimately I could not bring the change that I feel we needed, and so this is the reason I am stepping down: to let others bring what they feel our community needs. I want to thank the community for the support they have given me. This was a very hard decision to make, but I'm a "give it 100%" person and I tried. #KiaKaha PASS The post Open Letter to the PASS Membership, appeared first on SQLServerCentral.

T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
T-SQL Tuesday is a monthly blog party in the SQL community, and this month it is hosted by Lisa Bohm (b | t). It's about what else I have learned through presenting; it has to be something technical that did not relate to the presentation we were preparing. We should all write about learning PowerPoint (jk), although I did learn to put on subtitles to get closed captioning.

What I learned unintentionally is more about using Linux than I ever intended to know at that point in time. We had implemented the TIG (Telegraf/InfluxDB/Grafana) stack at work. I wasn't the one who implemented it, but I had talked to the person who had about using it in presentations, and they were OK if I did. Linux, though, was something I hadn't used in about 20 years.

So first off, what flavor of Linux was I going to use? I took to Pluralsight to learn some basics of Linux, going down the Red Hat certification path, where the instructor was using CentOS because it was free and close to Red Hat in commands. Wow, that was a learning curve: everything was a command prompt, taking me back to seventh grade and the MS-DOS 3.3 I learned back then.

First, I installed InfluxDB and Grafana and got Telegraf going on my SQL Server instance, but I couldn't even load the interface for Grafana. I discovered that when you install a program on Linux, unlike Windows, the services aren't set to start automatically, and not only that, the ports aren't open on the firewall. Everything is locked down, which is probably a good thing. But wow, a lot of Googling happened. Then there was editing the config files for Telegraf on a Linux SQL instance. I had to learn how to navigate VIM. Boy, that brought back memories, but none that helped me remember how to use it.

All said and done, I need to learn some Linux, lots of it, to be proficient in it. So maybe I should go back and finish those Pluralsight classes, but I also want to learn containers and get Azure certified; geez, which thing can I do?

The post T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? first appeared on Tracy Boggiano's SQL Server Blog. The post T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? appeared first on SQLServerCentral.


Daily Coping 8 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to contact someone you can't be with to see how they are doing. One of the things I did early on in the pandemic was reach out every day or two to a few random people in my contact list. The pace of things slowed down across the summer, but I decided to have a few more in-depth conversations, rather than just a "how are you" query. I opened up a Facebook Messenger conversation with a friend recently, reaching out to see what they were up to, and how life was going. Across a few days, we exchanged numerous messages, touching base and having a conversation around the rest of our busy lives. On one hand, I enjoyed the chance to reach out to someone and contact them. It was good to catch up and see how life was getting along. On the other, this brought some sadness, as I had planned on seeing this person across the summer, which didn't happen. With the prospect of months more of living like this, I'm disheartened that I won't see this person for a while. Not a great coping day for me. The post Daily Coping 8 Dec 2020 appeared first on SQLServerCentral.


T-SQL Tuesday #133–What I Learned From Presenting from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
3 min read
This month Lisa Griffin Bohm is the host, and thanks to her for hosting. She was one of the last people I pressured lightly to host. She came up with a very creative invite: she is asking us to share something technical that we learned that wasn't related to the actual presentation. While I've presented many times on various topics, and seen many more talks, I often go for a specific reason. At work it might be because I need to go, but at events, I usually pick a session based on the topic or presenter. Since most do a good job talking about their topic, I had to really think about this one.

A Couple Choices

I've got two that came to mind as I pondered what wasn't related to the topic. One is small, with minor impact to my work. The other had a much larger impact on me.

The first is a story from when I was researching a talk on Always Encrypted in 2016 and ran into an issue I hadn't expected. This was Microsoft's first big change to encryption options since 2005, and I was excited about it. I knew about the restrictions with data types, collation, etc., but they seemed acceptable to me. As I was building demos and working on how to show certificate movement, I created certificates in various ways, including some in text files using the common PFX format. Little did I know that SQL Server can't read these. Instead, you need to convert them to PVK, which isn't well known. Many people use the .cer or .pfx, but for whatever reason, SQL Server doesn't support those.

The second story was while rehearsing for a talk at Build. Once again, I was speaking, but delivering only a small piece of a multi-person talk with a few Microsoft employees. Donovan Brown was one of them, and as we worked on timing and transitions and tailored my section to fit, we had time to chat a bit. This was in the middle days of Visual Studio Team Services, which became Azure DevOps a short time later. As I was talking about some of the challenges I'd had in TFS, Donovan showed me a Java app he was maintaining as a side project, which was being built, tested, and released from VSTS. I was surprised, as Microsoft was still mostly focused on their own technology. He showed me some of the any language, any platform, any framework philosophy that was being used to make Azure DevOps a complete platform, not a Microsoft one. I was surprised, and I've continued to be as I watch new capabilities and features appear, few of which are tied to Microsoft products. That greatly impacted my career, and continues to do so today as I work with more and more customers that use non-Microsoft technologies.

The post T-SQL Tuesday #133–What I Learned From Presenting appeared first on SQLServerCentral.
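The PFX/PVK detail above is easiest to see when loading a certificate into SQL Server from files rather than from the Windows certificate store. As a minimal, hedged sketch (the certificate name, file paths, and password are hypothetical, not from the post), CREATE CERTIFICATE expects a .cer public-key file plus a private key already converted to the .pvk format:

-- Hypothetical paths and names; SQL Server reads the public key from a .cer file
-- and the private key from a .pvk file protected by a password.
CREATE CERTIFICATE DemoCert
FROM FILE = 'C:\certs\DemoCert.cer'
WITH PRIVATE KEY
(
    FILE = 'C:\certs\DemoCert.pvk',
    DECRYPTION BY PASSWORD = 'StrongP@ssw0rd!'
);

At the time the author was building those demos, a .pfx bundle could not be handed to this statement directly; it had to be split and converted to the .cer/.pvk pair first, which is exactly the surprise described above.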

Using Aws Sdk With Go for Ec2 Ami Metrics from Blog Posts - SQLServerCentral

Anonymous
07 Dec 2020
8 min read
Source

The source code for this repo is located here:

What This Is

This is a quick overview of some AWS SDK Go work, but not a detailed tutorial. I'd love feedback from more experienced Go devs as well. Feel free to submit a PR with tweaks or suggestions, or just comment at the bottom (which is a GitHub-issue-powered comment system anyway).

Image Age

Good metrics can help drive change. If you identify metrics that help you quantify areas of progress in your DevOps process, you'll have a chance to show the progress made and chart the wins. Knowing the age of the image underlying your instances could be useful if you wanted to measure how often instances were being built and rebuilt. I'm a big fan of making instances as immutable as possible, with less reliance on changes applied by configuration management and build-oriented pipelines, and more baked into the image itself. Even if you don't build everything into your image and are just doing "golden images", you'll still benefit from seeing the average age of images used go down. This would represent more continual rebuilds of your infrastructure. Containerization removes a lot of these concerns, but not everyone is in a place to go straight to containerization for all deployments yet.

What Using the SDK Covers

I decided this would be a good chance to use Go, as the task is relatively simple and I already know how I'd accomplish this in PowerShell. If you are also on this journey, maybe you'll find this detail inspiring to help you get some practical application in Go. There are a few steps that would be required:

Connection & authorization
Obtain a list of images (filtering required)
Obtain a list of instances
Match images to instances where possible
Produce an artifact in file form

Warning… I discovered that the SDK is pretty noisy and probably makes things a bit tougher than just plain idiomatic Go. If you want to learn pointers and dereferencing with Go… you'll be a pro by the time you are done with it.

Why This Could Be Useful In Learning More Go

I think this is a pretty great, small, metric-oriented collector to focus on, as it ties in with several areas worth future versions. Since the overall logic is simple, there's less need to focus on understanding AWS and more on leveraging different Go features.

Version 1: MVP that just produces a JSON artifact
Version 2: Wrap up in a Lambda collector and produce an S3 artifact
Version 3: Persist metrics to CloudWatch instead as a metric
Version 4: Datadog or Telegraf plugin

From the initial iteration I'll post, there's quite a bit of room for even basic improvement that my quick and dirty solution didn't implement:

Use channels to run parallel sessions to collect multi-region metrics in less time
Use sorting with the structs properly, which would probably cut down on overhead and execution time dramatically
Time-series metrics output for CloudWatch, Datadog, or Telegraf

Caveat

Still learning Go. Posting this up and welcoming any pull requests or comments (comments will open a GitHub issue automatically). There is no proper isolation of functions and tests applied. I've determined it's better to produce and get some volume under my belt than to focus on immediately making everything best practices. Once I've gotten more familiar with proper Go structure, removing logic from main() and more will be important. This is not a complete walkthrough of all concepts, more a few things I found interesting along the way.
Some Observations & Notes On V1 Attempt

omitempty

Writing to JSON is pretty straightforward, but what I found interesting was handling null values. If you don't want the default initialized value for the data type to be populated, then you need to specify additional attributes in your struct tags to let the encoder know how to properly serialize the data. For instance, I didn't want to populate a null value for AmiAge, as 0 would mess up any averages you were trying to calculate.

type ReportAmiAging struct {
	Region             string     `json:"region"`
	InstanceID         string     `json:"instance-id"`
	AmiID              string     `json:"image-id"`
	ImageName          *string    `json:"image-name,omitempty"`
	PlatformDetails    *string    `json:"platform-details,omitempty"`
	InstanceCreateDate *time.Time `json:"instance-create-date"`
	AmiCreateDate      *time.Time `json:"ami-create-date,omitempty"`
	AmiAgeDays         *int       `json:"ami-age-days,omitempty"`
}

In this case, I just set omitempty and it would set to null if I passed in a pointer to the value. For a much more detailed walk-through of this: Go's Emit Empty Explained.

Multi-Region

Here things got a little confusing, as I really wanted to run this concurrently, but I shelved that for v1 to deliver results more quickly. To initialize a new session, I provided my starting point.

sess, err := session.NewSession(&aws.Config{
	Region: aws.String("eu-west-1"),
},
)
if err != nil {
	log.Err(err)
}
log.Info().Str("region", string(*sess.Config.Region)).Msg("initialized new session successfully")

Next, I had to gather all the regions. In my scenario, I wanted the flexibility to ignore regions that were not opted into, to allow fewer regions to be covered when this setting was correctly used in AWS.

// Create EC2 service client
client := ec2.New(sess)

regions, err := client.DescribeRegions(&ec2.DescribeRegionsInput{
	AllRegions: aws.Bool(true),
	Filters: []*ec2.Filter{
		{
			Name:   aws.String("opt-in-status"),
			Values: []*string{aws.String("opted-in"), aws.String("opt-in-not-required")},
		},
	},
},
)
if err != nil {
	log.Err(err).Msg("Failed to parse regions")
	os.Exit(1)
}

The filter syntax is pretty ugly. Due to the way the SDK works, you can't just pass in *[]string{"opted-in","opt-in-not-required"} and then reference this. Instead, you have to use the AWS helper functions to create pointers to the strings and then dereference them. Deep diving into this further was beyond my allotted time, but it is definitely a bit clunky. After gathering the regions, you'd iterate and create a new session per region, similar to this:

for _, region := range regions.Regions {
	log.Info().Str("region", *region.RegionName).Msg("--> processing region")
	client := ec2.New(sess, &aws.Config{Region: *&region.RegionName})
	// Do your magic
}

Structured Logging

I've blogged about this before (mostly on microblog). As a newer gopher, I've found that zerolog is pretty intuitive. Structured logging is really important to being able to use log tools and get more value out of your logs in the future, so I personally like the idea of starting with them from the beginning. Here you can see how you can provide name/value pairs along with the message.

log.Info().Int("result_count", len(respInstances.Reservations)).Dur("duration", time.Since(start)).Msg("\tresults returned for ec2instances")

Using this provided some nice readable console feedback, along with values that a tool like Datadog's log parser could turn into values you could easily make metrics from.

Performance In Searching

From my prior blog post, Filtering Results In Go, I also talked about this. The lack of syntactic sugar in Go means this seemed much more verbose than I was expecting. A few key things I observed here were:

It is important to set your default layout for time if you want any consistency.
Sorting algorithms, or even just basic sorting, would likely reduce the overall cost of a search like this (I'm betting pretty dramatically).
Pointers. Everywhere. Coming from a dynamic scripting language like PowerShell/Python, this is a different paradigm. I'm used to isolated functions, which have less focus on passing values in to modify directly. In .NET you can pass in variables by reference, which is similar in concept, but it's not something I found a lot of use for in scripting. I can see the massive benefits when at scale, though, as avoiding more memory grants by using existing memory allocations with pointers would be much more efficient. Just have to get used to it!

// GetMatchingImage will search the ami results for a matching id
func GetMatchingImage(imgs []*ec2.Image, search *string) (parsedTime time.Time, imageName string, platformDetails string, err error) {
	layout := time.RFC3339 // "2006-01-02T15:04:05.000Z"
	log.Debug().Msgf("\t\t\tsearching for: %s", *search)
	// Look up the matching image
	for _, i := range imgs {
		log.Trace().Msgf("\t\t\t%s <--> %s", *i.ImageId, *search)
		if strings.ToLower(*i.ImageId) == strings.ToLower(*search) {
			log.Trace().Msgf("\t\t\t %s == %s", *i.ImageId, *search)
			p, err := time.Parse(layout, *i.CreationDate)
			if err != nil {
				log.Err(err).Msg("\t\t\tfailed to parse date from image i.CreationDate")
			}
			log.Debug().Str("i.CreationDate", *i.CreationDate).Str("parsedTime", p.String()).Msg("\t\t\tami-create-date result")
			return p, *i.Name, *i.PlatformDetails, nil
			// break
		}
	}
	return parsedTime, "", "", errors.New("\t\t\tno matching ami found")
}

Multiple Return Properties

While this can be done in PowerShell, I rarely did it in the manner Go does.

amiCreateDate, ImageName, platformDetails, err := GetMatchingImage(respPrivateImages.Images, inst.ImageId)
if err != nil {
	log.Err(err).Msg("failure to find ami")
}

Feedback Welcome

As stated, feedback from any more experienced Gophers would be welcome. Anything for round 2. Goals for that will be, at a minimum:

Use go test to run.
Isolate main and build basic tests for each function.
Decide to wrap up in a Lambda or plugin.

#tech #development #aws #golang #metrics

The post Using Aws Sdk With Go for Ec2 Ami Metrics appeared first on SQLServerCentral.


Data Architecture Blog Post: CI CD in Azure Synapse Analytics Part 1 from Blog Posts - SQLServerCentral

Anonymous
07 Dec 2020
1 min read
Hello Dear Reader! It's been a while. I've got a new blog post over on the Microsoft Data Architecture Blog on using Azure Synapse Analytics, titled "CI CD in Azure Synapse Analytics Part 1". I'm not sure how many parts will be in this series. I have at least 2 planned. We will see after that. So head over and read up, my friends! As always, thank you for stopping by. Thanks, Brad. The post Data Architecture Blog Post: CI CD in Azure Synapse Analytics Part 1 appeared first on SQLServerCentral.