
Tech News

3711 Articles

ETL Antipattern: No Error Handling Logic from Blog Posts - SQLServerCentral

Anonymous
16 Dec 2020
1 min read
I usually avoid talking about technology in absolutes, but here’s one that I can share without reservation: On a long enough timeline, every single ETL process will eventually fail. Is your ETL design built to handle a failure? I see far too many SSIS packages, ADF data factories, and other data movement applications built with the assumption of 100% success,... The post ETL Antipattern: No Error Handling Logic appeared first on Tim Mitchell. The post ETL Antipattern: No Error Handling Logic appeared first on SQLServerCentral.
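The source post is truncated here, but the antipattern it names is straightforward to illustrate. The sketch below is not from the original article; the procedure and log table names (dbo.Load_StagingOrders, dbo.ETL_RunLog) are assumptions for illustration only. It shows the kind of TRY/CATCH-plus-logging wrapper the post argues every load step should have, rather than assuming 100% success:

-- Hypothetical names for illustration; not objects from the original post.
CREATE OR ALTER PROCEDURE dbo.Load_StagingOrders
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        -- the actual load step would go here, e.g. INSERT ... SELECT from the source

        COMMIT TRANSACTION;

        INSERT INTO dbo.ETL_RunLog (StepName, RunStatus, RunDate)
        VALUES ('Load_StagingOrders', 'Succeeded', SYSDATETIME());
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        -- record the failure instead of silently assuming success
        INSERT INTO dbo.ETL_RunLog (StepName, RunStatus, RunDate, ErrorMessage)
        VALUES ('Load_StagingOrders', 'Failed', SYSDATETIME(), ERROR_MESSAGE());

        THROW;  -- re-raise so the orchestrator (SSIS, ADF, Agent job) sees the failure
    END CATCH;
END;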


T-SQL Tuesday #121: Gifts received for this year Redux 2020 from Blog Posts - SQLServerCentral

Anonymous
16 Dec 2020
4 min read
At the end of 2019, Mala (b|t) invited us to write about the gifts we'd gotten during the year. I've decided to try to make this a yearly habit. This year, with COVID, I know it's hard to be as thankful with lockdowns, not being able to see loved ones, and holidays not being the same, but I think we all still have something we can be thankful for and see as a gift. I've been gifted with things this year personally and professionally. So to honor the 12 days of Christmas, here are 12 gifts I have received. Here is the post from 2019.
1. My physical health. With COVID going around everywhere, that is very important, although my doctor suspects I had it in February, before it was a known thing to get tested for. Other than those two weeks I have been perfectly healthy.
2. My mental health has taken some bumpy roads this year, as I don't like being cut off from people, but I have a therapist and some amazing #SQLFamily who have helped during and after our Happy Hour, and well, anytime. I'm well medicated and adjusted now, and with the therapy I've gotten over the last year, the holidays aren't bothering me and tanking me into depression.
3. I've advocated for five different children this year, and am still advocating for two of them through the North Carolina Guardian ad Litem program, putting me up to 54 kids in 17 years. Read more about that here. This is absolutely my favorite job; the DBA job just pays me money so I can volunteer for things like this. Who can have enough volunteer jobs? Not me. I'm up to three outside the SQL Community. See my newest one at #12.
4. I had the pleasure of speaking at PASS Summit for the fourth year in a row, and on Mental Health this time. I got great reviews, which you can read here, and my highest score ever. I spoke at several user groups and about 13 SQLSaturdays again, something I never take for granted. When I first started talking on Mental Health I met some resistance and even got a really nasty evaluation from an attendee, but now I get user group leaders asking me to give the presentation because COVID has made the times more stressful for people. If anybody would like me to present this to your group or conference, just ping me; email and Twitter are at the bottom of the page.
5. I've mentored a couple of people along the way, career-wise and to become speakers. That seems like a gift for them, but really it is a gift for me, as I like giving more than receiving.
6. After Rob Sewell helped me learn how to write checks for dbachecks, I set the world on fire writing dbachecks for CIS compliance, and may have added a couple more based on requests.
7. I've embraced my colorful-hair self, having orange hair, then red/blue hair, now yellow/purple, and about to be red/green for Christmas.
8. #SQLFamily has been a tremendous rock of people that I can count on. I had a situation in May where I was able to reach out to members and they were able to keep me sane.
9. My job. I've been financially secure despite the dramatic shift in the economy in the US.
10. I was awarded the PASS PASSion Award at Summit this year for my efforts in the SQL Community.
11. I was awarded the MVP Data Platform Award in August.
12. I am starting to volunteer with NAMI in my county in NC, to help give more talks about personal experience with mental illness, education, and maybe with their website.
Along the way I've acquired many more people as part of my #sqlfamily and personal family, and added to my core value of making a difference. I hope everyone had a blessed Christmas and has as many gifts for the year to be thankful for as I do.
Despite the difficult year I would like to encourage everyone to look for the gifts they have in life no matter how small they are.  Bring on 2021 and vaccines and let’s see what gifts we find next year. The post T-SQL Tuesday #121: Gifts received for this year Redux 2020 first appeared on Tracy Boggiano's SQL Server Blog. The post T-SQL Tuesday #121: Gifts received for this year Redux 2020 appeared first on SQLServerCentral.


Announcing the Calgary Data User Group from Blog Posts - SQLServerCentral

Anonymous
16 Dec 2020
1 min read
In 2018 I started a new user group called the Calgary Data User Group, and hosted one session called "The Ethics of Machine Learning." It was well-attended, and its format of a discussion group (as opposed to a regular slide show with demos) was well-received. At the end of 2019, the founder of the Calgary… The post Announcing the Calgary Data User Group appeared first on Born SQL. The post Announcing the Calgary Data User Group appeared first on SQLServerCentral.


Daily Coping 16 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
16 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.  Today’s tip is to listen wholeheartedly to others without judging them. I was talking with a friend recently and they were struggling with their workload. They complained about the amount of work, and the dates things were due, while also noting they were struggling to focus and get things done. In the past, I’d have suggested something, or maybe questioned how they were planning on doing the work. I might even try to get them to see where they weren’t taking advantage of the time they did have and resources available. In other words, I would have been my typical Type-A, let me solve your problems person. Instead, I did none of that. I sympathized, empathized, and wished them luck, while asking if I could help. Sometimes I think venting and talking aloud helps someone sort through their thoughts, and my wife has helped me to learn how to be a sounding board without offering my own input. The post Daily Coping 16 Dec 2020 appeared first on SQLServerCentral.


On table variable row estimations from Blog Posts - SQLServerCentral

Anonymous
15 Dec 2020
3 min read
At first glance, the question of how many rows are estimated from a table variable is easy. But, is it really that simple? Well, not really. To dig into the why, first we need to identify why table variables estimate 1 row. The obvious answer is because they don't have statistics. However…

ALTER DATABASE CURRENT SET AUTO_CREATE_STATISTICS OFF
GO
CREATE TABLE Test (SomeCol INT);
INSERT INTO Test (SomeCol) VALUES (1),(22),(37),(45),(55),(67),(72),(86),(91)
SELECT SomeCol FROM Test
SELECT * FROM sys.stats WHERE object_id = OBJECT_ID('Test')
DROP TABLE dbo.Test

That table has no statistics, but it still estimates rows correctly. So it's not just the absence of statistics. Hmmm… Where else is there a difference with a table variable? It has to do with when the plans are generated. The XE event results are from an event tracking statement start and end and the post-compilation event for the plan. For the query using the table variable, the entire batch is compiled before the execution starts. For the permanent table, there are multiple compilation events. And this is because of something called 'deferred compile'. For the table variable, the entire batch is compiled at the start, at a time where the table variable does not exist, and because there are no statistics, no recompile is triggered after the insert. Hence, there cannot be any row estimation other than 1 row, because the table did not exist when the estimate was made. For the permanent table, the compilation of the query that uses the table is deferred until the query starts, not when the batch starts. Hence the plan for the query is generated after the table exists, after it's been populated. That's the difference here. Now, there's still no statistics, and so there's no way to get data distribution, but that's not the only way to get information on the rows in the table. The Storage Engine knows how many rows are in the table, though data distribution isn't known. Hence, with a table variable we can expect to see an estimated row count other than 1 any time the table variable exists before the query that uses it is compiled. That will happen when the table variable is a table-type parameter, when the query using it has the RECOMPILE option, and when SQL 2019's deferred compile for table variables is in play.

CREATE OR ALTER PROCEDURE TestRowEstimations @Input TestTableType READONLY
AS
SELECT SomeCol FROM @Input;

DECLARE @Test TABLE (SomeCol INT);
INSERT INTO @Test (SomeCol) VALUES (1),(22),(37),(45),(55),(67),(72),(86),(91);
SELECT SomeCol FROM @Test;
SELECT SomeCol FROM @Test OPTION (RECOMPILE);
GO

[Execution plan screenshots in the original post: table-valued parameter; normal select on compatibility mode 140; normal select on compatibility mode 150; select with OPTION (RECOMPILE).]

The post On table variable row estimations appeared first on SQLServerCentral.
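The procedure above references a TestTableType that the excerpt doesn't define. Purely as a minimal sketch (the type definition and calling pattern below are assumptions for illustration, not part of the original post), the table-valued-parameter path could be exercised like this:

-- Assumed definition of the table type referenced by the procedure
-- (in practice it has to exist before the procedure above is created).
CREATE TYPE dbo.TestTableType AS TABLE (SomeCol INT);
GO

-- Populate a variable of that type, then pass it in.
DECLARE @Rows dbo.TestTableType;
INSERT INTO @Rows (SomeCol) VALUES (1),(22),(37),(45),(55),(67),(72),(86),(91);

-- The parameter already exists and is populated before the procedure's plan
-- is compiled, so the optimizer can use the real row count (9) rather than 1.
EXEC dbo.TestRowEstimations @Input = @Rows;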


Daily Coping 15 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
15 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. All my coping tips are under this tag. Today's tip is to notice when you're hard on yourself or others and be kind instead. This is one of those skills I've worked on for years, maybe a decade. I have tried to learn how to balance acceptance with drive. I want to accept, or maybe just experience, a situation where I am not achieving or accomplishing enough. I need to do this without suppressing my drive, but rather by more realistically viewing situations. I see many driven, type-A people never willing to give up, and often chastising themselves to do more, to do better. Maybe the example that springs to mind for me is Michael Jordan. He's amazing, likely the best ever, but a jerk. Not someone I'd want to emulate. I'd take a more balanced, more polite approach instead. I'd rather be Tim Duncan, if I were a high achiever. But maybe I'd just be happy being Luol Deng: a semi-successful player, not a huge star, but a nice guy. What I want to do is drive forward, in a way that balances all parts of my life with success. With my wife's success. With the support and love I give my kids or friends. If I don't accomplish something, I try to stop and realistically examine why. It might be I had other commitments, or no energy (which happens a lot in 2020). It might be I chose to do something else and didn't have time. It might be because I was just being lazy or not putting in effort. The former items are places I give myself a big break. For the latter, I try to think about how to do better, how I would do something different in the future in the same situation. I accept what happened, I experience it, and maybe feel disappointed, but I don't chastise myself. I move forward. The post Daily Coping 15 Dec 2020 appeared first on SQLServerCentral.

SQL Puzzle – Eight Queens on a Chess board from Blog Posts - SQLServerCentral

Anonymous
15 Dec 2020
1 min read
Here's a SQL Puzzle for the festive period. 2020 has been a year of many things but amongst it all, it has been the year of chess. The combined effects of the pandemic and The Queen's Gambit on Netflix have given rise to a level of interest in the game not seen for generations. So here for your enjoyment is a chess puzzle to solve using your favourite database language. The post SQL Puzzle – Eight Queens on a Chess board appeared first on SQLServerCentral.
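The article itself doesn't give a solution, so nothing here reflects the author's approach. Purely as one possible hedged sketch in T-SQL, a recursive CTE can place one queen per row while carrying the used columns and diagonals as delimited strings, so each new placement needs only three string checks:

-- One possible approach (not from the original post): enumerate placements row by row.
WITH Digits AS (
    SELECT col FROM (VALUES (1),(2),(3),(4),(5),(6),(7),(8)) AS v(col)
),
Queens AS (
    -- anchor: a queen on row 1 in every column
    SELECT  1 AS RowNo,
            CAST(CONCAT(',', col, ',') AS VARCHAR(100))     AS used_cols,
            CAST(CONCAT(',', 1 + col, ',') AS VARCHAR(100)) AS used_diag1,  -- row + col
            CAST(CONCAT(',', 1 - col, ',') AS VARCHAR(100)) AS used_diag2   -- row - col
    FROM Digits
    UNION ALL
    -- recursive step: extend each partial solution by one row, rejecting clashes
    SELECT  q.RowNo + 1,
            CAST(CONCAT(q.used_cols, d.col, ',') AS VARCHAR(100)),
            CAST(CONCAT(q.used_diag1, q.RowNo + 1 + d.col, ',') AS VARCHAR(100)),
            CAST(CONCAT(q.used_diag2, q.RowNo + 1 - d.col, ',') AS VARCHAR(100))
    FROM Queens AS q
    CROSS JOIN Digits AS d
    WHERE q.RowNo < 8
      AND CHARINDEX(CONCAT(',', d.col, ','), q.used_cols) = 0               -- same column
      AND CHARINDEX(CONCAT(',', q.RowNo + 1 + d.col, ','), q.used_diag1) = 0 -- same / diagonal
      AND CHARINDEX(CONCAT(',', q.RowNo + 1 - d.col, ','), q.used_diag2) = 0 -- same \ diagonal
)
SELECT used_cols AS queen_columns_by_row   -- e.g. ',1,5,8,6,3,7,2,4,'; 92 rows in total
FROM Queens
WHERE RowNo = 8;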


Retail’s urgency: Addressing customer and inventory needs with data from What's New

Anonymous
14 Dec 2020
5 min read
Jeff Huckaby, Market Segment Director, Retail and Consumer Goods, Tableau | Kristin Adderson | December 14, 2020

Retail had already shifted from being product-centric to customer-centric, influenced by growing omni-channel initiatives that encourage digital transformation. Then Covid-19 hit. What retailers learned is that they must be willing and able to adapt quickly to internal and external forces. The silver lining: at the heart of digital transformation and adaptability is data.

According to McKinsey & Company, due to Covid-19, companies accelerated digital transformation by seven years. Tableau observed this with our retail customers as most staff were forced to work remotely, curb-side service became a required option for customers, and innovative solutions were needed to protect the safety of employees and customers. And it is no surprise that digital commerce exploded with an increased desire to shop online and limit face-to-face interactions. According to Salesforce, a new global record was hit on Black Friday for digital revenue, with over $60B in online spend, a growth of 30 percent over last year.

As our retail and consumer goods customers focus on wrapping up the holiday shopping season and a tumultuous year, we wanted to give them an early preview of an upcoming whitepaper, releasing in January 2021. The visualized data will address common but nagging issues like in-stock position and product availability, online customer journey, competitive pricing, supply chain optimization, and loyalty program analysis, among others. Let's explore visual analyses that reveal critical inventory and customer location insights, which lead to better site location and marketing opportunities.

On-Shelf Availability Dashboards

[Photo by Jeff Huckaby at a grocery store on March 11, 2020.] Empty shelves were a typical scene in March, posing problems for stores and customers. Need toilet paper or baby formula? There was none. They flew off the shelves as quickly as they were stocked. These dashboards connect inventory and availability to grocers, suppliers, stores, and warehouses, so the fast-moving consumer goods (FMCG) industry can act to eliminate out-of-stocks. Here's to more availability of toilet paper in 2021!

"Stock in Trade" Dashboard

This visual analysis, created by Tableau partner Atheon Analytics, helps retailers and their suppliers quickly and easily see where inventory is under- or over-stocked by grocer and store location. As a supplier, you can further examine product availability in warehouses (depots in the UK) to know where stock must be allocated, ensuring availability at certain stores. Unifying retailers, suppliers, and manufacturers around this near real-time data is essential going forward to support constantly changing customer demands.

In the next example, see the product data, category, or sub-category rolled up to the individual grocer. Visualized on the right is current demand compared with stock levels, so you know when you are approaching dangerously low or no inventory to support customers. Atheon Analytics brings together this critical information from suppliers and retailers in Snowflake to work effectively from one operational canvas and act in unison.

Customer Location and Site Selection Dashboards

With lockdowns and work-from-home mandates leading to a reduction in commuting, many retailers observed a dramatic change in customer flows. They should take a fresh, ongoing look at current customer location data and competitors to quickly and confidently know the changing dynamics of their local markets and how customer composition changes throughout the day. Leading that charge is Tableau partner Lovelytics, which created a "Customer Location and Site Selection" dashboard powered by global location provider Foursquare. It analyzes the Foursquare Visits data feed using geospatial analysis, offers an option to add your own customer demographics and traffic data, and enables businesses to pinpoint an optimal site for opening or how best to use an existing location, helping inform customer marketing and targeting. Evaluate via spatial analysis the number of visitors, the amount of foot traffic, and how the flow of customers changes. This information could easily be combined with real-time sales and loyalty data, allowing restaurants, in this example, to use Salesforce Einstein to create a churn analysis, predict customers they may lose, and know when to activate a new retention campaign within Salesforce Marketing Cloud.

This location view specifically analyzes more than 1.3 million site visits to various restaurant chains in the Denver, Colorado area, with the option to look closely by store location, day, and hour. In Tableau, it is easy to "play back" how local areas are changing and how that impacts existing stores. It is also an incredible way to ensure new site selection won't cannibalize existing locations and that you allocate the correct labor to offer a safe, high-quality experience for customers.

Benefits of inventory and customer clarity for retail

Demystifying inventory availability and ensuring grocers, suppliers, and warehouses (or depots) are aligned ensures that the right inventory gets to the right stores as customer demands and traffic change on a dime. This same data can help remove the guesswork from new store construction builds or help prioritize remodels.

We look forward to sharing the remaining dashboards next month, and all interactive examples will be free to access on Tableau Public. Have a very safe and enjoyable holiday season!

Join the discussion

Join over 3,500 retail and consumer goods customers to discuss retail analytics, ask questions, and provide help.

About the Partners

We want to thank Atheon Analytics and Lovelytics for their participation. To learn more about the incredible examples highlighted, please connect with them.


Setting Defaults for New SQL Compare Projects from Blog Posts - SQLServerCentral

Anonymous
14 Dec 2020
3 min read
Recently I wrote about ignoring comments in SQL Compare. That seems like something I want to do in all my projects, so I went looking for how to set this as a default. It wasn't obvious to me, but since I can ping the Redgate Software developers and support staff, I found an answer for this post. If I start SQL Compare by clicking a project, it opens and my settings are there. When I start SQL Compare, it brings up the New Project dialog, shown here: However, if I don't want a new project, and close that, I have a basic interface. Going through the menus, I don't see any way to set global options. There are Application Options, but these don't have anything to do with projects. However, I realized that I missed something a support person pointed out to me. On the Options tab for any project is a button that says "Save as my defaults". I can set things from any project, or I can click the "My Projects" button in the toolbar, which gives me a list of projects (I don't save many). At the bottom of this is an "Edit" button, which opens the familiar project dialog. I can click Options from there and see the options. When I do that, I can now set the items that I care about. Ignore Comments is one, but there are a few others I think cause issues. I do want to ignore encryption objects; I shouldn't have the same ones in dev as prod, so I'll check that. Others:
Use database compat level (likely better than server version these days)
Online = ON
Ignore identity seed and increment (this could get changed)
Ignore WITH ENCRYPTION
I can then click "Save as my defaults". This will ensure new projects have these options. If I have old projects I want to update, I can always click "My defaults" in the project to load what I've saved. This animation shows how the button works. Now I can easily ensure that projects work as expected on my machine. Unfortunately, these defaults are stored in the registry, so this is done on a client-by-client basis, but you can ensure all your projects are set up the same by default. SQL Compare is a fantastic product to make it easy to see what has changed in a database. If you've never used it, give it a try today. The post Setting Defaults for New SQL Compare Projects appeared first on SQLServerCentral.


Getting Started Reading Execution Plans: Highest Cost Operator from Blog Posts - SQLServerCentral

Anonymous
14 Dec 2020
1 min read
Reading execution plans in SQL Server is just hard. There’s a lot to learn and understand. I previously outlined the basics I use to get started when I’m looking at an execution plan for the first time. However, just those pointers are not enough. I want to explain a little further why and how those […] The post Getting Started Reading Execution Plans: Highest Cost Operator appeared first on Grant Fritchey. The post Getting Started Reading Execution Plans: Highest Cost Operator appeared first on SQLServerCentral.

Daily Coping 14 Dec 2020 from Blog Posts - SQLServerCentral

Anonymous
14 Dec 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. All my coping tips are under this tag.  Today’s tip is to do something helpful for a friend or family member. This one is for my wife, her employee, my daughter, and for me. I installed an electric winch last winter to raise and lower the arena door. We use this almost daily to move hay in and out. Someone ran it too far and bent the drum axle. As a result, the winch didn’t work and I had to go out late one night, and hook up the manual winch again. A pain for me one night, but an ongoing pain for everyone. It’s been broken for a week, and it’s been cold, so I haven’t worked on it. However, I ordered a new winch a few days back and when it arrived, I took a few hours to remount it and hook it up so that everyone can enjoy the convenience of electricity rather than the 70 turns the crank takes to open the door. The post Daily Coping 14 Dec 2020 appeared first on SQLServerCentral.


ETL Antipattern: Performing Full Loads Instead of Incremental Loads from Blog Posts - SQLServerCentral

Anonymous
14 Dec 2020
1 min read
In my last post in the ETL Antipatterns series, I wrote about the common antipattern of ingesting or loading more data than necessary. This brief post covers one specific case of loading more data than necessary by performing a full data load rather than using a smaller incremental load. ETL Antipattern: performing full loads instead of incremental loads Earlier this... The post ETL Antipattern: Performing Full Loads Instead of Incremental Loads appeared first on Tim Mitchell. The post ETL Antipattern: Performing Full Loads Instead of Incremental Loads appeared first on SQLServerCentral.
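The summary above is truncated, but the alternative it contrasts with a full load is the standard high-water-mark pattern: remember the last value loaded and only pull rows changed since then. The sketch below is a generic illustration of that pattern, not Tim Mitchell's code; the table and column names (dbo.ETL_Watermark, dbo.SourceOrders, dbo.StagingOrders, ModifiedDate) are assumptions.

-- Assumed watermark table: one row per source table with the last value loaded.
DECLARE @LastLoaded DATETIME2(3);

SELECT @LastLoaded = LastModifiedLoaded
FROM dbo.ETL_Watermark
WHERE SourceTable = 'SourceOrders';

-- Incremental load: only rows changed since the previous run,
-- instead of truncating and reloading the entire target.
INSERT INTO dbo.StagingOrders (OrderID, CustomerID, OrderTotal, ModifiedDate)
SELECT s.OrderID, s.CustomerID, s.OrderTotal, s.ModifiedDate
FROM dbo.SourceOrders AS s
WHERE s.ModifiedDate > @LastLoaded;

-- Advance the high-water mark for the next run.
UPDATE dbo.ETL_Watermark
SET LastModifiedLoaded = (SELECT MAX(ModifiedDate) FROM dbo.StagingOrders)
WHERE SourceTable = 'SourceOrders';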


Introduction to SQL Server Query Store from Blog Posts - SQLServerCentral

Anonymous
14 Dec 2020
1 min read
A quick exploration of Query Store, the most anticipated feature of SQL 2016. The post Introduction to SQL Server Query Store appeared first on SQLServerCentral.
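Since the summary is only one line, here is a quick, hedged illustration of what the feature looks like in practice (the database name SalesDB and the TOP threshold are arbitrary placeholders): Query Store is enabled per database and then queried through its catalog views.

-- Turn Query Store on for a database so it captures plans and runtime stats.
ALTER DATABASE SalesDB
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

-- Top queries by total CPU, with the text Query Store captured.
SELECT TOP (10)
       q.query_id,
       qt.query_sql_text,
       SUM(rs.avg_cpu_time * rs.count_executions) AS total_cpu_time,
       SUM(rs.count_executions)                   AS executions
FROM sys.query_store_query         AS q
JOIN sys.query_store_query_text    AS qt ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan          AS p  ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
GROUP BY q.query_id, qt.query_sql_text
ORDER BY total_cpu_time DESC;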

Best of the Tableau Web: November from What's New

Anonymous
11 Dec 2020
3 min read
Andy Cotgreave, Technical Evangelist Director, Tableau | Kristin Adderson | December 11, 2020

Hello everyone and welcome to the latest round up of the Tableau community highlights. I was reminded this month of how important "success" in analytics is about much more than one's skills in the platform. This month, as always, the community has shared many super tips and tricks to improve your ability to master Tableau, but there has been a great set of posts on all the other career development "stuff" that you mustn't ignore if you want to succeed. Judit Bekker's latest post describes how she found a job in analytics. Her story contains great advice for anyone setting out on this wonderful career journey. Ruth Amarteifio from The Information Lab describes how to ask the right questions before embarking on a data project. Believe me, these are great questions that I wish I had known before I started my career in analytics. Helping grow a community is a great means to develop your network and open yourself to new opportunities. What better way than starting a Tableau User Group? Interworks has a great list of ideas to inspire you and help you get started on the right path. If those aren't enough, then you must head to Adam Mico's blog where he curated reflections from 129 different people (!) from the Tableau community. There are so many great stories here. Read them all or dip into a few, and you'll find ideas to help you build your own career in analytics, regardless of which tool or platform you end up using. As always, enjoy the list! Follow me on Twitter and LinkedIn as I try and share many of these throughout the month. Also, you can check out which blogs I am following here. If you don't see yours on the list, you can add it here.

Tips and tricks
Andy Kriebel – #TableauTipTuesday: How to Sort a Chart with a Parameter Action
Luke Stanke – Beyond Dual Axis: Using Multiple Map Layers to create next-level visualizations in Tableau
Marc Reid – Tableau Map Layers Inspiration
Adam Mico – The #DataFam: 128 Authors From 21 Countries Spanning Six Continents Share Their 2020 Tableau…
Bridget Cogley – Data Viz Philosophy: Better than Bar Charts
Adam McCann – Layering Multiple Charts in Tableau 2020.4
Mark Bradbourne – Real World Fake Data – Human Resources Recap
Lindsay Betzendahl – Visualizing COVID-19: March's #ProjectHealthViz and the Impact of Pushing Boundaries

Formatting, Design, Storytelling
Evelina Judeikyte – Three Design Principles for More Effective Dashboards
Ken Flerlage – Creating a Basic Beeswarm Plot in Tableau
Adam McCann – Animated Buttons and Animated Night/Day Toggle

Prep
Tom Prowse – Tableau Prep vs Einstein Analytics – Combining and Outputs

Server
Mark Wu – Difference between 'suspend extract refresh task' vs 'tag stale content' feature

Set and Parameter Actions
Kevin Flerlage – Dynamically Show & Hide Parameters & Filters based on another Parameter Selection
Ethan Lang – 3 Essential Ways to Use Dynamic Parameters in Tableau


Toolbox - When Intellisense Doesn't See Your New Object from Blog Posts - SQLServerCentral

Anonymous
11 Dec 2020
2 min read
I was just working on a new SQL job, and part of creating the job was adding a few new tables to our DBA maintenance database to hold data for the job. I created my monitoring queries, and then created new tables to hold that data. One tip: use SELECT...INTO as an easy way to create these types of tables - create your query and then add a one-time INTO clause to create the needed object with all of the appropriate column names, etc.

SELECT DISTINCT SERVERPROPERTY('ServerName') as Instance_Name
, volume_mount_point as Mount_Point
, cast(available_bytes/1024.0/1024.0/1024.0 as decimal(10,2)) as Available_GB
, cast(total_bytes/1024.0/1024.0/1024.0 as decimal(10,2)) as Total_GB
, cast((total_bytes-available_bytes)/1024.0/1024.0/1024.0 as decimal(10,2)) as Used_GB
, cast(100.0*available_bytes/total_bytes as decimal(5,2)) as Percent_Free
, GETDATE() as Date_Stamp
INTO Volume_Disk_Space_Info
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
INNER JOIN sys.master_files AS mf WITH (NOLOCK)
    ON vfs.database_id = mf.database_id AND vfs.file_id = mf.file_id
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.FILE_ID)
ORDER BY volume_mount_point

I thought at this point that everything was set, until I tried to write my next statement... The dreaded Red Squiggle of doom! I tried to use an alias to see if Intellisense would detect that - no luck. Some Google-Fu brought me to the answer on StackOverflow - there is an Intellisense cache that sometimes needs to be refreshed. The easiest way to refresh the cache is simply CTRL-SHIFT-R, but there is also a menu selection in SSMS to perform the refresh: Edit >> IntelliSense >> Refresh Local Cache. In my case, once I performed the CTRL-SHIFT-R, the red squiggles disappeared! Hope this helps! The post Toolbox - When Intellisense Doesn't See Your New Object appeared first on SQLServerCentral.