Tech News


Closure with the Professional Organization for SQL Server from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
5 min read
I know that this isn’t the correct name, though the by-laws still list this as the corporation. Perhaps this is one more sign of the failure to evolve and grow that I’ve felt from the organization.

Three directors of the PASS Board of Directors resigned this week. Mindy Curnutt, longtime member, volunteer, and advocate for the community, was first, with Melody Zacharias and Hamish Watson following suit. You can read their open letters (Mindy, Melody, Hamish), which appear to hint at some sort of disagreement, argument, ethical or moral failure, or maybe just anger. I have no idea what happened; I’m curious as to what it could be, but I also think this might be something better left to fade away.

I’ve had my disagreements with the organization in the past, and I certainly think the culture and governance of the executive board is broken. In the aftermath of outcries from various prominent members of the community, Grant Fritchey left a note that there are legal issues as to what is happening now. I do not know if these are financial/debt issues or something else, and I am not speculating on what these are. I do appreciate Grant’s engagement with the community, and in my memory, since Kevin Kline, he has been one of the very, very few to actually engage with the community on controversial issues. I haven’t always agreed with him, but I respected and appreciated the effort.

However, most directors that have served on the executive committee, which includes the Executive Director from C&C, release very little information. Updates take place relatively rarely, and little is proposed or discussed with members publicly. There is no law or legal liability that would prevent an announcement acknowledging the resignation of directors, or a news release that thanks Mindy, Melody, and Hamish for their service. No penalty for noting they have resigned. Here, I’ll make it easy for you on Twitter:

@SQLPASS: Today Mindy Curnutt resigned from the PASS Board of Directors.
We thank Mindy for her many years of service and wish her well in future endeavors.

@SQLPASS: Today Melody Zacharias resigned from the PASS Board of Directors. We thank Melody for her many years of service and wish her well in future endeavors.

@SQLPASS: Today Hamish Watson resigned from the PASS Board of Directors. We thank Hamish for his many years of service and wish him well in future endeavors.

No copyright here, feel free to cut and paste. It would be even easier to drop these three notes on the https://www.pass.org/About-PASS/PASS-News page, because, well, this is news. Instead, we have the same lack of engagement, trust, respect, and leadership that has permeated the culture for well over a decade. No accountability to the membership, or the board of directors, that I can see. Whether this is the appointed members (President, VP-Finance, VP-Marketing), and/or the executive director, they operate independently of the community and the board. This has been my primary complaint, and I suspect also the complaint from many in the SQLFamily community.

I see no reason for a large organization to exist primarily to run a profitable conference that pays salaries and bonuses to people who are de facto employees, with management that doesn’t seek to be a part of the community. As I write that, I’m saddened, mostly for the employees of C&C. Over the years I have had many opportunities to work with Marcella, Craig, Anika, Leeza, Erick, Audrey, and likely others I am forgetting. They have worked hard to ensure events have run smoothly, and I’ve appreciated their help and assistance in various matters. They have been a part of the community, and I hope they continue to be. I’m saddened that they may find themselves cast aside if the organization fails. I do hope they receive proper notice and compensation if this is the case.

My one regret in all of this is that SQL Saturday is inextricably bound up in the legal mechanisms of PASS.
Andy Warren, Brian Knight, and I gifted this to the organization, trusting they would be good stewards of the events. They have been, and I know that these events will continue, either under this moniker or another. Our community is too strong to let these lapse. We will find a way for these to continue, whether with PASS or not.

I hope Microsoft continues to support community events and organizations, but I do not hope they provide any more assistance to the PASS organization. The lack of governance and transparency, along with the poor culture of the executive committee and management company in engaging with the community, leads me to the conclusion that this is not the place to invest and engage with a community. The organization has helped a strong community grow over the years, but it has outlived its usefulness. It is time for an evolution to something new that better exists to serve the community rather than the organization itself. I continue to support SQL Saturday events and chapters regardless of affiliation.

The post Closure with the Professional Organization for SQL Server appeared first on SQLServerCentral.


Find columns with NULL values across the table from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
4 min read
Recently, I was working on a performance tuning assignment with an esteemed client. The size of their databases was growing tremendously, and growth in database size is directly proportional to the disk space requirement: the bigger the database, the more disk space is needed. When we talk about disk space for production, it’s not specific to just one server. There are multiple factors that need to be considered, such as the primary server, secondary servers, backup servers, and the backups themselves (full, differential, and log backups). A long time ago, I blogged about a similar topic. Read the full article here.

While we were discussing the various possibilities for reducing the database size, the client brought up a very interesting point. He said, “My database has many huge tables (with sizes beyond 500 GB). Most of them have multiple columns which have NULL values across the table.” Although the client was aware that there were many such tables and columns, there was no definite list. This opened up a whole new arena for me and gave me a unique insight. I started thinking about how to build some logic to fetch the list of all such columns and tables. You will be amazed to hear it, but it’s true: there were more than 4,000 such columns belonging to 500+ tables. We estimated that we could save terabytes of disk space by marking all such columns as SPARSE. More on SPARSE columns can be read in the official Microsoft documentation.

In this article we’ll see how to get the list of columns which have NULL values across all the rows of a table. Note: Running the query for all the tables together would take a lot of time. It is recommended to run the query for sets of tables in batches, in parallel, in multiple instances of SSMS or with the help of jobs. In the next article, we’ll talk about how to estimate the storage savings from removing such columns or marking them as SPARSE.
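The core test that the full script automates can be sketched for a single column: COUNT over a column ignores NULLs, so a zero count combined with a non-zero row count means every value in that column is NULL. A minimal sketch (the table and column names here are hypothetical, for illustration only):

```sql
-- Hypothetical table and column names.
-- COUNT(SomeColumn) counts only non-NULL values, while COUNT(*) counts rows,
-- so NonNullValues = 0 with TotalRows > 0 means the column is entirely NULL.
SELECT COUNT(*)          AS TotalRows
     , COUNT(SomeColumn) AS NonNullValues
FROM dbo.SomeLargeTable WITH (NOLOCK);
```

The full script uses COUNT(DISTINCT column) for the same purpose; both expressions return 0 when every value in the column is NULL.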
IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = 'tables_with_null_values_across')
BEGIN
    CREATE TABLE tables_with_null_values_across
    (
        TableName  VARCHAR(200)
      , TotalRows  BIGINT
      , ColumnName VARCHAR(200)
    )
END

/*
DROP TABLE IF EXISTS #nullable_columns;
DROP TABLE IF EXISTS #tables_with_nullable_columns;
*/

SET NOCOUNT ON

SELECT A.object_id
     , B.name AS TableName
     , A.name AS ColumnName
     , ROW_NUMBER() OVER(PARTITION BY A.object_id ORDER BY A.name ASC) AS RowID
INTO #nullable_columns
FROM sys.columns A
INNER JOIN sys.tables B
    ON B.object_id = A.object_id
    AND B.type = 'U'
INNER JOIN tables_with_same_values_across C
    ON C.TableName = QUOTENAME(B.name)
    AND C.ColumnName = QUOTENAME(A.Name)
LEFT JOIN tables_with_null_values_across D
    ON D.TableName = C.TableName
    AND D.ColumnName = C.ColumnName
WHERE A.is_nullable = 1
AND D.TableName IS NULL
-- AND B.name IN ('', '')
-- Note: Supply the table names in the filter clause for B.name in order to
-- run the query in batches of tables.

SELECT DISTINCT A.object_id
     , B.name AS TableName
     , IDENTITY(INT, 1, 1) AS RowID
INTO #tables_with_nullable_columns
FROM #nullable_columns A
INNER JOIN sys.tables B
    ON B.object_id = A.object_id

DECLARE @TableName AS SYSNAME
      , @ColumnName AS SYSNAME
      , @Object_ID AS INT
      , @Table_RowID AS INT
      , @Column_RowID AS INT
      , @Column_Distinct_Values_Count AS BIGINT
      , @Total_Rows AS BIGINT

SET @Table_RowID = 1;

WHILE EXISTS (SELECT 1 FROM #tables_with_nullable_columns WHERE RowID = @Table_RowID)
BEGIN
    SELECT @Object_ID = object_id
         , @TableName = TableName
    FROM #tables_with_nullable_columns
    WHERE RowID = @Table_RowID;

    SET @Column_RowID = 1;

    WHILE EXISTS (SELECT 1 FROM #nullable_columns WHERE object_id = @Object_ID AND RowID = @Column_RowID)
    BEGIN
        SELECT @ColumnName = ColumnName
        FROM #nullable_columns
        WHERE object_id = @Object_ID
        AND RowID = @Column_RowID;

        DECLARE @SQLString NVARCHAR(500);
        SET @SQLString = N'SELECT @Column_Distinct_Values_Count = COUNT(DISTINCT ' + QUOTENAME(@ColumnName) + ')
            , @Total_Rows = COUNT(1) FROM ' + QUOTENAME(@TableName) + ' WITH (NOLOCK)';

        BEGIN TRY
            EXECUTE sp_executesql @SQLString
                , N'@Total_Rows BIGINT OUTPUT, @Column_Distinct_Values_Count BIGINT OUTPUT'
                , @Total_Rows = @Total_Rows OUTPUT
                , @Column_Distinct_Values_Count = @Column_Distinct_Values_Count OUTPUT;
        END TRY
        BEGIN CATCH
        END CATCH

        IF (@Column_Distinct_Values_Count = 0)
        BEGIN
            INSERT INTO tables_with_null_values_across (TableName, TotalRows, ColumnName)
            VALUES (QUOTENAME(@TableName), @Total_Rows, QUOTENAME(@ColumnName))
        END

        SET @Column_RowID = @Column_RowID + 1;
    END

    SET @Table_RowID = @Table_RowID + 1;
END

DROP TABLE IF EXISTS #nullable_columns;
DROP TABLE IF EXISTS #tables_with_nullable_columns;

The post Find columns with NULL values across the table appeared first on SQLServerCentral.
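Once the candidate columns are identified, the storage savings mentioned above come from marking them as SPARSE, which stores NULL values at no space cost (in exchange for a small overhead on non-NULL values). A minimal sketch, with hypothetical table and column names:

```sql
-- Hypothetical names: mark an always-NULL column as SPARSE.
ALTER TABLE dbo.OrderDetails
    ALTER COLUMN LegacyComments VARCHAR(500) SPARSE NULL;

-- The change is reversible if the column starts holding data again:
-- ALTER TABLE dbo.OrderDetails
--     ALTER COLUMN LegacyComments VARCHAR(500) NULL;
```

Note that a SPARSE column must be nullable, and certain data types (such as text, image, timestamp, geometry, and geography) cannot be marked SPARSE; check the Microsoft documentation referenced above before converting.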


Is On-premises SQL Server Still Relevant? from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
5 min read
Unequivocally, yes, on-premises SQL Server instances are still relevant. I’m a firm believer that the cloud is not a fad and is not going away; it’s just an extension of a tool that we are already familiar with. The Microsoft marketing slogan is “It’s just SQL,” and for the most part that is indeed true. However, that does not mean that every workload will benefit from being in the cloud. There are scenarios where it does not make sense to move things to the cloud, so let’s take a look at a few of them.

The cloud can cost a lot

There is no such thing as a free lunch, and the cloud is no exception. I am sure that we’ve all heard horror stories of individuals leaving resources active, which in turn cost large sums of money. While the cloud offers a wide range of capabilities that aid the day-to-day life of IT professionals everywhere, it might not be cost effective for your given workload or data volumes. Compute resources and everything associated with them cost money. If you need higher CPU, more money. If you need terabytes of storage, more money. If you need a higher CPU-to-memory ratio for that virtual machine, more money. You essentially rent all of the resources the cloud offers, and the bigger the space, the more money it takes. Of course, all of this is dependent on your organizational requirements and associated workloads. By having an on-premises environment you can achieve a lower cost of ownership for hardware. That being said, the cloud offers more efficient means of upgrading and scaling, which are usually limited in on-premises ecosystems, and that can actually save you money. It’s a trade-off that organizations have to weigh to see if moving to the cloud makes sense.

You want control of all things

Most things in the cloud require that organizations relinquish control. That is just a plain fact, and that’s not changing.
We are trading speed and agility from an infrastructure perspective for a lower ability to control certain aspects of the architecture. For example, with Azure SQL Database (Platform as a Service), database administrators can no longer control the database backup method or frequency. In exchange for this loss of control, though, backups are taken automatically for us. In my opinion, this is a more than fair exchange, and I sleep better knowing that a tried and vetted backup process is taking care of things without my intervention.

You have specific compliance or regulation requirements

While most of the players in the public cloud space (Azure, Amazon, Google) are certified for a multitude of compliance regulations, it’s possible that you have a very specific one that the provider is unable to meet. If this is the case, then your ability to move to the cloud is limited and you are forced to remain on-premises. Regulations could also pose issues when moving to the cloud. These regulations could be imposed by the governing body of the organization or be sourced from various places. If this is the case, it’s possible that the cloud is not a viable solution for your organization. I do suspect that as cloud technology continues to advance, regulations and compliance requirements will slowly be brought into the fold and allow for appropriate cloud implementations.

You do not have the expertise

Put simply, you do not have the knowledge internally to successfully migrate to the cloud, nor do you have the budget to hire someone to move you there. Shameless plug: this is one of our core competencies here at Denny Cherry & Associates Consulting. We help organizations (big or small) get into the cloud to help push their data ecosystem forward. However, not every organization can afford to hire consultants (short or long term) to help them with such a project.
In this instance, until you can get the expertise to help, you are left with either staying on-premises or trying to figure it out on your own. In some respects, the cloud opens new security exposures that must be accounted for when moving to it. If these are not accounted for, severe issues could arise for the organization, so I recommend not going down the “we’ll figure it out as we go” path without some level of guidance.

Your workloads do not perform in the cloud

Even though I am a huge fan of Azure, some workloads just won’t perform well unless you break out your wallet (see the first section). Even with proper performance tuning, the comparison between on-premises and the cloud is never going to be a true apples-to-apples comparison. The infrastructure is just too vastly different to really get that “exact” level of comparison. Organizations must find the sweet spot between performance and infrastructure costs, and frankly, sometimes that sweet spot dictates remaining with on-premises hardware.

Summary

There are probably many other reasons why on-premises infrastructures will continue to be relevant. Each organization may have unique requirements for which having SQL Server on its own hardware is the only solution. Remember, regardless of where you deploy SQL Server, it is just SQL and it’ll behave the same (mostly). This does not mean that you should not continue to expand your skill set. Make sure to continue to learn about cloud technologies so that when your organization is ready to make the leap, you can do so in a safe and secure manner.

© 2020, John Morehouse. All rights reserved. The post Is On-premises SQL Server Still Relevant? first appeared on John Morehouse. The post Is On-premises SQL Server Still Relevant? appeared first on SQLServerCentral.


Daily Coping 11 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.  Today’s tip is to leave a positive message for someone else to find. An unexpected note is something that can brighten someone’s day. If it’s a positive message, that is, and that’s the goal today. One thing I do is keep chocolate around my house. My wife and daughter will have a craving periodically, usually every few weeks, and we’re not very close to any stores that sell chocolate they like. The cheap stuff at the local gas station doesn’t cut it. As a result, I keep a few bars around, usually hidden in the house. My wife will ask me for some, but my daughter will just start hunting. I decided to not take a chance and just write a couple nice notes and tape them to the bars. That way the next time they find one (or I give them one), they get a surprise message. The post Daily Coping 11 Dec 2020 appeared first on SQLServerCentral.


Three Ways to Quickly Analyze Fundraising Performance with Tableau from What's New

Anonymous
10 Dec 2020
6 min read
Jarrett O’Brien, Nonprofit Cloud Product Marketing Director, Salesforce
December 10, 2020

2020 has delivered so many unknowns in terms of how nonprofits operate, convene supporters, and plan for the future. All these unknowns have come with financial uncertainty—which forces leadership to make tough decisions and challenging pivots to keep their organizations thriving. These decisions are best made with strong conviction and a foundation of rock-solid data.

Eric Dayton, Director of Data at buildOn, faced many unknowns when he came home from Malawi on March 3rd, a week before the COVID shutdown. Fortunately, buildOn’s ongoing investment in their digital transformation helped the organization shift gears smoothly and chart a course of action during the early days of the pandemic. Dayton shared how their investment in digital helped: “Data and transparency are leading tenets for buildOn and the communities we serve. If we hadn’t dug deep and done all the work to digitize our mission over the last few years, we wouldn’t be so well set up to succeed.” Dayton was able to help his organization quickly pivot their fundraising and programs to continue to achieve their goals.

For many organizations, the right technology, data, and strategy can make all the difference. A misplaced metric can erode trust in a board or funder meeting, but the right one can get your program funded. Using robust data to inform your decisions can help your nonprofit become more agile—while lack of data can hold nonprofits back from making decisions at all. “We grew up in the 1990s and early 2000s. We were spreadsheet-based, with simple, digestible KPIs presented to main stakeholders in a basic Excel file,” shared Dayton. “That spreadsheet grew into a system teams relied on, but it didn’t function or scale well.
Nonprofits think these manual systems are helping their business, but they’re actually the source of 90% of the organization’s problems, especially when it comes to analyzing large amounts of data.”

Digital-first thinking from buildOn supports a data-driven culture that empowers staff to lead with confidence and navigate uncertain times. And they’re not alone in finding success in a digital-forward approach: in the 3rd edition of the Nonprofit Trends Report, we saw 27% of organizations with high digital maturity exceed their fundraising goals during the pandemic, compared to only 7% of organizations with low digital maturity.

[Tableau dashboard: 3rd edition of the Nonprofit Trends Report, showing nonprofit organizations that exceeded goals, by digital maturity.]

In an environment that’s shifting and evolving constantly, nonprofit fundraising leaders are seeking answers to these urgent questions:

What is our revenue health, and how is it trending?
Which effective fundraising strategies should we pursue?
How are campaigns performing, based on actual dollars raised?

To help fundraising professionals get answers to these questions, we are excited to share with you our new Tableau Dashboards for Nonprofit Fundraising, which leverage the power of Tableau and the Nonprofit Success Pack (NPSP). Product Manager Mike Best had clear directives for this initiative. “Our customers told us they wanted a holistic picture of their fundraising effectiveness that was not only easy to understand, but—just as important—easy to implement.” Best worked with colleagues who previously worked in development operations, our customers, and analytics experts to help fundraising professionals get the information they needed deployed more quickly in their work.

For Eric Dayton at buildOn, that meant being able to move full speed ahead with Tableau for Fundraising to unlock their donor data.
“We were able to quickly deploy Tableau Starter Dashboards for Salesforce Nonprofit Cloud to unlock our donor data, forecast more effectively, and visualize revenue performance. Analytics allow us to make data-driven decisions across teams, which will allow us to navigate 2021 with greater impact.”

Here are three ways the Tableau Dashboards for Nonprofit Fundraising can help you unlock your data and make decisions for the future of your organization, with confidence.

1. Understand and visualize revenue health

Are you fielding questions like, “How’s the forecast?” or “Are our average donor contributions going up or down?” Being able to easily visualize and share this information builds toward a more data-driven culture. Clean data that’s easy to see and interpret helps to streamline the decision-making process in times of uncertainty, and to navigate that uncertainty with more ease. The fundraising overview dashboard helps you quickly scan revenue against a benchmark of last year or your revenue goals for the end of this year, uncovering both risks and opportunities. You can then dive into monthly performance, based on the number of donors, location, average gift amount, and the value of each donor or type of campaign.

[Tableau Dashboards for Nonprofit Fundraising: revenue overview.]

2. Create effective fundraising strategies

Once you know that revenue is going down in July, or that you’re off course to hit an important goal, the next step is to figure out where you might be able to shift strategies and get back on track. The next tab helps you compare key statistics around new, retained, reactivated, recurring, and lapsed donors. This worksheet gives you a quick snapshot of donors, revenue, average gift amount, and more. Fundraising leaders can even zero in on a type of donation—say mid-level—to scan where that money is coming from.

[Tableau Dashboards for Nonprofit Fundraising: donor acquisition, retention, and churn.]

3. Drive campaign performance

Once you have a strategy to reach those donors, it’s incredibly valuable to have the capacity to see how those campaigns or changes to channel tactics are performing. With the campaign efficacy dashboard, you can understand which campaigns drive the most revenue, benchmark against campaigns with similar strategies or launched in close proximity to each other, and glean campaign trends over time.

[Tableau Dashboards for Nonprofit Fundraising: campaign efficacy.]

With Tableau Dashboards for Nonprofit Fundraising, you can democratize data and drive decisions that help your mission thrive in the new normal. Tableau helps you take the next step to give your people the power of data, whether they’re a fundraising leader reading a report on their phone or a funder visiting your website. To learn more about Tableau and ways to integrate and scale your analytics, check out the Tableau Basics for Nonprofits trail on Trailhead.


New Microsoft data governance product: Azure Purview from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
3 min read
Azure Purview is a data governance solution that is the sequel to Azure Data Catalog, and it is now available in public preview. Purview catalogs data from on-premises, multi-cloud, or software-as-a-service (SaaS) locations. Purview lets you understand exactly what data you have, manage its compliance with privacy regulations, and derive insights. You can create a unified, up-to-date understanding of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage via the Purview Data Map. Purview aims to maximize the compliant use of your own data by understanding it, how it moves (i.e., lineage), and who it is shared with. It also integrates with Microsoft Information Protection and with Power BI sensitivity labels and certification and promotion labels.

Azure Purview includes three main components:

Data discovery, classification, and mapping: It will automatically find all of an organization’s data, on-premises or in the cloud, even data managed by other providers, and evaluate the characteristics and sensitivity of the data as it scans it.

Data catalog: It enables all users to search for trusted data using a simple web-based search engine. There are also visual graphs that let you quickly see if data of interest is from a trusted source.

Data governance: It provides a bird’s-eye view of your company’s data landscape, enabling “data officers” to efficiently govern the use of data. This enables key insights such as the distribution of data across multiple environments, how data is moving, and where sensitive data is stored.

There is a sophisticated search engine to view all the scanned items, and it tracks data lineage. Below are the nine sources you can currently scan (more to come soon).
I have got all the scans to work on all of the sources except Power BI, as that requires a bit of extra work to scan a workspace different from the one in your subscription (by default, the system will use the Power BI tenant that exists in the same Azure subscription). To register a Power BI workspace outside your subscription, see Use PowerShell to register and scan Power BI in Azure Purview (preview). For those sources that are not supported, there is an option to submit data to the catalog via the Azure Purview REST APIs. You can also use the APIs to build your own user experience on the catalog. You can also use a “map view” to see all the sources and group them under collections. Azure Purview also comes with system-defined classification rules, but you can add your own custom classification rules as well.

To ramp up quickly, I suggest you visit the Azure Purview product page, get started with the Azure Purview documentation, and view the Mechanics video to see Azure Purview in action; give your feedback via UserVoice.

More info:
A first look at Azure Purview – Data Governance for your data estate
Azure Synapse Analytics – Introduction to Azure Purview
Microsoft introduces Azure Purview data catalog; announces GA of Synapse Analytics
Use Power BI with Azure Purview to achieve better data governance and discovery
Map your data estate with Azure Purview

The post New Microsoft data governance product: Azure Purview first appeared on James Serra's Blog. The post New Microsoft data governance product: Azure Purview appeared first on SQLServerCentral.

Daily Coping 10 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag. Today’s tip is to support a charity, cause, or campaign you really care about. I did this recently, when Jeff Atwood was raising money for the Georgia Senate races in the US. I donated, entered his contest, and won an iPad. I didn’t expect to win, and donated because I wanted to, but this was a bonus. That inspired me to donate as well to Kiva. I’ve sent them money in the past, and did so again, giving them a little, and then lending out more to help others trying to grow their own businesses. I’ve had good luck with repayments, which lets me lend again, but even when someone fails at their business, I’m glad I helped them make an attempt. The post Daily Coping 10 Dec 2020 appeared first on SQLServerCentral.


Azure SQL Database Connectivity from Blog Posts - SQLServerCentral

Anonymous
10 Dec 2020
1 min read
Have you ever wondered how your connection from outside of Azure to your database is handled? It is important to understand that there is a difference between the route(s) taken when connecting from inside Azure and those taken when connecting from outside. When outside … Continue reading → The post Azure SQL Database Connectivity appeared first on SQLServerCentral.


Join Tableau at DreamTX 2020—here’s what you need to know from What's New

Anonymous
09 Dec 2020
4 min read
Kristin Adderson
December 9, 2020

When Tableau joined the Salesforce family, we were excited to accelerate and extend our mission to help people see and understand data. That’s what building a Data Culture is all about! We started that work last year by bringing the Tableau Data Fam to Dreamforce. We got to know the vibrant Trailblazer community and introduced people to the innovative analytics platform that helps us deliver on the promise of our mission—through demos, mini-sessions, and a rockstar keynote.

[Tableau Data Fam joins the 2019 Dreamforce.]

While we can’t get together in person this year, we’re making the most of the annual conference and transforming Dreamforce into a virtual format called Dreamforce To You. We kicked off the event with a keynote on December 2, and now we want to invite you to join us for a special four-day event: DreamTX. Open to all Trailblazers, including the Tableau Data Fam, DreamTX takes place December 14-17 and is free for anyone to join. We’ll be bringing together people from all around the world to learn, connect, and share—directly from our homes. If you’re new to Salesforce, you might be wondering what DreamTX is all about and whether you should take part. So, here’s our guide to help you get the most from DreamTX.

What is DreamTX?

DreamTX—the Dreamforce Trailblazer Experience—is a four-day virtual event jam-packed with demos, luminary sessions, customer success stories, and conversations with leadership, with session times friendly for the Americas, Asia Pacific, and Europe. With something for every line of business and every industry, you’ll have tons of opportunities to learn from customers and peers who have built resilience through 2020—all right from your couch. You’ll learn how the Customer 360 transforms businesses, hear amazing stories of leadership during a pandemic, hang out and connect with other Trailblazers, and even welcome some surprise guests for entertainment.
Best of all, it’s free to everyone, making it the most inclusive Dreamforce ever.

What will I learn?

You can be sure that Tableau is rolling into DreamTX energized and ready to share how critical a Data Culture is to empowering everyone. We’ll feature several sessions all about analytics, integration, and digital transformation, as well as a vision and roadmap session for the Tableau platform and Tableau CRM (formerly Einstein Analytics). Be sure to bookmark the “Unleash the Power of Data: MuleSoft and Tableau” session to learn how business and IT leaders can unlock data from disconnected applications to get actionable insights in one place—in Tableau, of course! And check out “AI Predictions with Einstein Discovery” to learn about our newest AI integration. This session will teach you how to build and deploy trusted ML-powered predictions in Tableau with no code required, enabling more teams to benefit from the power of AI. Ready to register and start planning your schedule?

How do I get ready for DreamTX?

Whether it’s your first Salesforce event or you’re a seasoned veteran, follow these tips to make the most of your DreamTX experience.

1. Register at Dreamforce.com

Head over to Dreamforce.com and reserve your spot by clicking the “Sign Me Up” button in the top right corner. If you’ve already created your Trailblazer Profile (say, from Dreamforce 2019), you can log in right away with your existing information. Otherwise, sign up in just a few steps to get in on all the learning and networking benefits that DreamTX and the Trailblazer Community provide!

2. Build your schedule

Each DreamTX session will be 20-25 minutes long, spanning different themed channels. You can use Trail Maps as guides to which sessions are most relevant to you—for example, select “Analytics” from the drop-down menu to see the recommended sessions for our data rockstars.
Then simply click on a session title and select “Bookmark” to save it to your personal DreamTX agenda, or “Add to Calendar” to save it to your personal calendar. You can also view all available sessions and add them to your calendar using the Agenda Builder. (There’s a short video featured on that webpage to explain how.) And don’t forget to check out the demos and workshops available on day four of DreamTX! They’re great learning opportunities on topics like the Tableau Viz Lightning Web Component, building advanced reports, and the Tableau CRM developer experience.

3. Enjoy DreamTX!

When it’s time, throw on your most comfortable clothes, grab a snack, and head over to your favorite spot on the couch to get watching. That’s the beauty of DreamTX being virtual—everyone’s invited to participate and learn! We look forward to seeing our Data Fam there. Get started by registering now, and join in the conversation on social at #DreamTX.

T-SQL Tuesday Retrospective #007: Summertime in the SQL from Blog Posts - SQLServerCentral

Anonymous
09 Dec 2020
1 min read
This is the seventh post in my retrospective attempt to answer every T-SQL Tuesday invitation. At the beginning of June 2010, Jorge Segarra invited us to write about our favourite hot new feature in SQL Server 2008 or 2008 R2. For me, this would be the introduction of the DATE, TIME, and DATETIME2 data types. Continue reading T-SQL Tuesday Retrospective #007: Summertime in the SQL. The post T-SQL Tuesday Retrospective #007: Summertime in the SQL appeared first on Born SQL. The post T-SQL Tuesday Retrospective #007: Summertime in the SQL appeared first on SQLServerCentral.

Daily Coping 9 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
09 Dec 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.

Today’s tip is to give kind comments to as many people as possible today.

I try to write these a few days in advance, so I’m going to repeat a few things I posted to others today. These were random tweets, messages, texts, etc.:

“That was a great presentation”
“That code looks great”
“Excellent way to express that”
“You look happy today”
“Your hair looks good”
“That message really made my day. Thanks”

The post Daily Coping 9 Dec 2020 appeared first on SQLServerCentral.

Open Letter to the PASS Membership, from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
When I was appointed to serve as a Director at Large for the PASS organization a year ago, it was with great passion, excitement, and enthusiasm. I had some great ideas about how I could help globalize, diversify, and improve our wonderful community, which I’ve been a member of since 2013. The PASS community has provided me with friends and the opportunity to connect, share, and learn. I wanted to serve our community, but have sadly realised that I cannot. It is with a heavy heart that I am delivering this message today. On Monday, 7th December 2020 (Pacific Time), I resigned from the PASS Board. I had considered resigning from the board back in July this year, but wanted to stay on and try to serve our community. I wanted to bring change – I campaigned on it when running for the board this year. Ultimately, I could not bring the change that I feel we needed, and so this is the reason I am stepping down: to let others bring what they feel our community needs. I want to thank the community for the support they have given me. This was a very hard decision to make, but I’m a give-it-100% person and I tried. #KiaKaha PASS The post Open Letter to the PASS Membership, appeared first on SQLServerCentral.

T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
T-SQL Tuesday is a monthly blog party in the SQL community, and this month it is hosted by Lisa Bohm (b | t). The topic is what else we have learned through presenting: something technical that did not relate to the presentation we were preparing. We could all write about learning PowerPoint (just kidding), although I did learn to turn on subtitles to get closed captioning.

What I learned unintentionally is more about using Linux than I ever intended to know at that point in time. We had implemented the TIG (Telegraf/InfluxDB/Grafana) stack at work. I wasn’t the one who implemented it, but I talked to the person who had about using it in presentations, and they were OK with that. The catch: Linux was something I hadn’t used in about 20 years.

First off, which flavor of Linux was I going to use? I took to Pluralsight to learn some Linux basics, following the Red Hat certification path, where the instructor used CentOS because it was free and close to Red Hat in its commands. Wow, that was a learning curve: everything was a command prompt, taking me back to seventh grade and the MS-DOS 3.3 I learned back then.

I installed InfluxDB and Grafana and got Telegraf going on my SQL Server instance, but I couldn’t even load the Grafana interface. I discovered that when you install a program on Linux, unlike Windows, the services aren’t set to start automatically, and the ports aren’t open on the firewall either. Everything is locked down, which is probably a good thing. But wow, a lot of Googling happened.

Then there was editing the config files for Telegraf on a Linux SQL instance, so I had to learn how to navigate vim. Boy, that brought back memories, but none that helped me remember how to use it.

All said and done, I need to learn a lot more Linux to be proficient in it. Maybe I should go back and finish those Pluralsight classes, but I also want to learn containers and get Azure certified. Geez, which should I do?
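The lockdown described above can be sketched in a few commands. This is a hedged example, assuming a CentOS/RHEL host with the stock influxdb, grafana-server, and telegraf packages already installed; the service names and port numbers are those packages' documented defaults, not details from the post, and the commands must run as root:

```shell
# Freshly installed services on CentOS are neither started nor enabled,
# so enable them to start at boot and start them right now.
systemctl enable --now influxdb grafana-server telegraf

# The firewall blocks their ports by default; open Grafana's web UI (3000)
# and InfluxDB's HTTP API (8086), then reload the firewall rules.
firewall-cmd --permanent --add-port=3000/tcp --add-port=8086/tcp
firewall-cmd --reload
```

After this, the Grafana login page should load at http://the-host:3000 instead of timing out.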
The post T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? first appeared on Tracy Boggiano's SQL Server Blog. The post T-SQL Tuesday #133: What (Else) Have I Learned from Presenting? appeared first on SQLServerCentral.

Daily Coping 8 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.

Today’s tip is to contact someone you can’t be with to see how they are doing.

One of the things I did early on in the pandemic was reach out every day or two to a few random people in my contact list. The pace of things slowed down across the summer, but I decided to have a few more in-depth conversations, rather than just a “how are you” query. I opened up a Facebook Messenger conversation with a friend recently, reaching out to see what they were up to and how life was going. Across a few days, we exchanged numerous messages, touching base and having a conversation around the rest of our busy lives.

On the one hand, I enjoyed the chance to reach out to someone and contact them. It was good to catch up and see how life was getting along. On the other, this brought some sadness, as I had planned on seeing this person across the summer, which didn’t happen. With the prospect of months more of living like this, I’m disheartened that I won’t see this person for a while. Not a great coping day for me.

The post Daily Coping 8 Dec 2020 appeared first on SQLServerCentral.

T-SQL Tuesday #133–What I Learned From Presenting from Blog Posts - SQLServerCentral

Anonymous
08 Dec 2020
3 min read
This month Lisa Griffin Bohm is the host, and thanks to her for hosting. She was one of the last people I pressured (lightly) to host, and she came up with a very creative invite: she is asking us to share something technical that we learned that wasn’t related to the actual presentation.

While I’ve presented many times on various topics and seen many more talks, I often go for a specific reason. At work it might be because I need to go, but at events, I usually pick a session based on the topic or presenter. Since most presenters do a good job talking about their topic, I had to really think about this one.

A Couple of Choices

Two stories came to mind as I pondered what wasn’t related to the topic. One is small, with minor impact on my work. The other had a much larger impact on me.

The first is from when I was researching a talk on Always Encrypted in 2016 and ran into an issue I hadn’t expected. This was Microsoft’s first big change to encryption options since 2005, and I was excited about it. I knew about the restrictions with data types, collation, etc., but they seemed acceptable to me. As I was building demos and working on how to show certificate movement, I created certificates in various ways, including some in files using the common PFX format. Little did I know that SQL Server can’t read these. Instead, you need to convert the private key to the PVK format, which isn’t well known. Many people use .pfx files, but for whatever reason, SQL Server doesn’t support them.

The second story is from rehearsing for a talk at Build. Once again, I was speaking, but delivering only a small piece of a multi-person talk with a few Microsoft employees. Donovan Brown was one of them, and as we worked on timing and transitions and tailored my section to fit, we had time to chat a bit. This was in the middle days of Visual Studio Team Services, which became Azure DevOps a short time later.
As I was talking about some of the challenges I’d had in TFS, Donovan showed me a Java app he was maintaining as a side project, which was being built, tested, and released from VSTS. I was surprised, as Microsoft was still mostly focused on their own technology. He showed me some of the any language, any platform, any framework philosophy that was being used to make Azure DevOps a complete platform, not a Microsoft one. I was surprised, and I’ve continued to be as I watch new capabilities and features appear, few of which are tied to Microsoft products. That greatly impacted my career, and continues to do so today as I work with more and more customers that use non-Microsoft technologies. The post T-SQL Tuesday #133–What I Learned From Presenting appeared first on SQLServerCentral.
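The PFX-to-PVK conversion from the first story can be done from the command line. Here is a hedged sketch using OpenSSL; all file names, the password, and the subject are placeholders, and a throwaway self-signed certificate is generated first to stand in for a real exported one:

```shell
# Generate a throwaway key and self-signed certificate, then bundle them
# as a PFX file to stand in for a real exported certificate.
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
    -days 1 -subj "/CN=demo"
openssl pkcs12 -export -inkey key.pem -in cert.pem -passout pass:secret \
    -out mycert.pfx

# Extract the certificate and convert it to the DER encoding SQL Server reads.
openssl pkcs12 -in mycert.pfx -passin pass:secret -clcerts -nokeys \
    -out extracted.pem
openssl x509 -in extracted.pem -outform DER -out mycert.cer

# Extract the private key and convert it to the PVK format SQL Server expects
# (-pvk-none writes the key unencrypted; protect the file accordingly).
openssl pkcs12 -in mycert.pfx -passin pass:secret -nocerts -nodes \
    -out extractedkey.pem
openssl rsa -in extractedkey.pem -outform PVK -pvk-none -out mycert.pvk
```

The resulting pair can then be loaded in T-SQL with CREATE CERTIFICATE ... FROM FILE = 'mycert.cer' WITH PRIVATE KEY (FILE = 'mycert.pvk').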