
Tech News

3711 Articles

PASS Virtual Summit 2020: I'm (Virtually) Presenting from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
3 min read
I'll be presenting at 7 AM Central Time in the first timeslot of the main three-day PASS Virtual Summit 2020 conference. It's long been a goal of mine to present at the best of all international SQL conferences, and this is the first year it happened for me, so I'm thrilled to be a part of it. It's not too late to register for the all-online event, with the same great quality content as always, at a fraction of the usual cost of going to Seattle. Like many (but not all) presentations at PASS Virtual Summit, my 75-minute presentation will feature roughly 60 minutes of pre-recorded (and painstakingly edited) content, with the rest of the time available for live Q&A with the speaker.

My presentation will cover a lot of important foundational material about security, accounts, and authentication. For folks new to SQL Server security design and administration, this will be a great foundation for your learning. For those experienced in SQL administration, this will be a thorough evaluation of what you know, or thought you knew, and maybe some gaps in what you know. I think there is content in here to interest everyone in the SQL career lifecycle, and I'm not just guessing at that. I got my first DBA job in 2006. I've been giving a presentation on security basics at user groups and SQLSaturdays for years; it was one of the first topics I started speaking on technically a decade ago. As my own experience has deepened and broadened throughout my career, so has the content I build into this presentation. So I'm going to start basic and build quickly from there, focusing my content around common hurdles and tasks that database administrators face, in the hopes of deepening or broadening your experience as well.

I'm setting the stage for a good conversation around security at PASS Virtual Summit 2020, especially around how permissions behave inside each database, how you can design database security, and the relationships between logins, users, and databases. My session is one of a four-part Learning Pathway on security. We worked together over the past four months to make sure we're presenting a thorough conversation on security. In subsequent presentations over the next three days:

John Morehouse is presenting on Understanding Modern Data Encryption Offerings for SQL Server, including a lot more information on all the various sorts of encryption, plus some important security features of SQL Server like Transparent Data Encryption and Always Encrypted.

Jeff Renz is presenting on Securing Your Data In Azure: Tips and Tricks, which covers Advanced Threat Detection, the Azure Key Vault, cloud secure connection strings, and certified data sets for security in Power BI.

Ed Leighton-Dick will cap it off with a presentation on Building a Security Dashboard for SQL Server, talking about when certs expire and what that actually means, more on Azure Advanced Threat Detection, and SQL Audit and monitoring.

The post PASS Virtual Summit 2020: I'm (Virtually) Presenting appeared first on SQLServerCentral.
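To make the logins-versus-users relationship concrete, here is a minimal T-SQL sketch; the login, user, and database names are hypothetical and not taken from the session.

-- Logins are server-level principals; users are database-level principals mapped to a login.
CREATE LOGIN AppLogin WITH PASSWORD = 'StrongPasswordHere!1';
GO
USE SalesDb;  -- hypothetical database
GO
CREATE USER AppUser FOR LOGIN AppLogin;        -- maps the server login into this database
ALTER ROLE db_datareader ADD MEMBER AppUser;   -- permissions are then granted inside the database
GO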


Consider the Benefits of Powershell for Developer Workflows from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
7 min read
Who Am I Talking To

You use bash or Python. PowerShell seems wordy, extra verbose, and annoying. It's a Windows thing, you say... why would I even look at it? Pry bash out of my fingers if you dare. (Probably not for you.)

What PowerShell Is

The best language for automating Windows... period. A great language for development tooling and productivity scripts, and one of the best languages for automation with interactivity. Python is fantastic, but its REPL isn't meant for the same interactivity you get with PowerShell; the PowerShell prompt is sort of like mixing Python and fish/bash in a happy marriage. It's a rich language (not just scripting) for interacting with AWS using AWS.Tools, with a rich object-oriented pipeline that can handle very complex actions in one-liners. It's mostly intuitive and consistent for command discovery; the Verb-Noun verbosity, a common complaint from bash pros, is the point: discoverability. tar, for example, is a bit harder to figure out than Expand-Archive -Path foo -DestinationPath foo. It's also a language with a robust testing framework for unit, integration, infrastructure, or any other kind of testing you want (Pester is awesome).

What PowerShell Isn't

Python. Good at data science. Succinct. Meant for high concurrency. Good at GUIs... but come on, we're devs, GUIs make us weak. A good web server. Lots more.

The Right Tool for the Job

I'm not trying to tell you never to use bash. It's what you know, great! However, if you haven't explored it, I'd say that once you get past some of the paradigm differences, there is a rich, robust set of modules and features that can improve most folks' workflows.

Why Even Consider PowerShell

As I've interacted more and more with folks coming from a mostly Linux background, I can appreciate that considering PowerShell seems odd. It's only recently become cross-platform in the lifecycle of things, so it's still a new thing to most. Having been immersed in the .NET world and now working on macOS and using Docker containers running Debian and Ubuntu (sometimes Alpine Linux), I completely get that it's not even in most folks' purview. Yet I think it's worth considering for developer workflows: there is a lot to gain with PowerShell for improving the more complex build and development workflows because of the access to .NET. No, it's not "superior". It's different. Simple CLI bash scripting is great for many things (thus the prior article about improving development workflow with Task, which uses shell syntax). The fundamental difference between bash and PowerShell is really text vs objects, in my opinion. This is where much of the value comes in when considering what to use.

Go For CLI Tools

Go provides a robust cross-platform single binary with autocomplete features and more. I'd say that for things such as exporting pipelines to Excel and other "automation" actions, it's far more work in Go. Focus Go on tooling where the extra plumbing and stronger typing give benefit rather than just overhead. AWS SDK operations, serverless/Lambda, APIs, complex tools like Terraform, and more fit the bill perfectly and are a great use case.

Scenario: Working with AWS

If you are working with the AWS SDK, you are working with objects. This is where the benefit comes in over CLI usage. Instead of parsing JSON results and using tools like jq to choose arrays, you can interact with the object by named properties very easily.
$Filters = @([Amazon.EC2.Model.Filter]::new('tag:is_managed_by','muppets'))
$InstanceCollection = (Get-EC2Instance -Filter $Filters).Instances |
    Select-PSFObject InstanceId, PublicIpAddress, PrivateIpAddress, Tags, 'State.Code as StateCode', 'State.Name as StateName' -ScriptProperty @{
        Name = @{
            get = { $this.Tags.GetEnumerator().Where{ $_.Key -eq 'Name' }.Value }
        }
    }

With this $InstanceCollection variable, we now have access to an easily used object with named properties.

Give me all the names of the EC2 instances: $InstanceCollection.Name
Sort those: $InstanceCollection.Name | Sort-Object (or use alias shorthand such as sort)
For each of these results, start the instances: $InstanceCollection | Start-EC2Instance

Beyond that, we can do many things with the rich ecosystem of prebuilt modules. Here are some examples of rich one-liners using the power of the object-based pipeline.

Export to JSON: $InstanceCollection | ConvertTo-Json -Depth 10 | Out-File ./instance-collection.json
Toast notification on results: Send-OSNotification -Title 'Instance Collection Results' -Body "Total results returned: $($InstanceCollection.Count)"
Export to Excel with a table: $InstanceCollection | Export-Excel -Path ./instance-collection.xlsx -TableStyle Light8 -TableName 'FooBar'
Send a rich PagerDuty event to flag an issue: Send-PagerDutyEvent -Trigger -ServiceKey foo -Description 'Issues with instance status list' -IncidentKey 'foo' -Details $HashObjectFromCollection
Use a CLI tool to flip to YAML (you can often use native tooling without much issue!): $InstanceCollection | ConvertTo-Json -Depth 10 | cfn-flip | Out-File ./instance-collection.yml

Now build a test (mock syntax) that passes or fails based on the status of the instances:

Describe "Instance Status Check" {
    Context "Instances That Should Be Running" {
        foreach ($Instance in $InstanceCollection) {
            It "should be running" {
                $Instance.StateName | Should -Be 'Running'
            }
        }
    }
}

Now you have a test framework with which you could validate operational issues across hundreds of instances, or just unit test the output of a function.

Exploring the Object

I did this comparison once for a coworker; maybe you'll find it useful too!

"Test Content" | Out-File ./foo.txt
$Item = Get-Item ./foo.txt
## Examine all the properties and methods available. It's an object
$Item | Get-Member

This gives you an example of the objects behind the scenes. Even though your console will only return a small set of properties back, the actual object is a .NET object with all the associated methods and properties. This means that Get-Item has access to properties such as the base name, full path, directory name, and more. You can access the actual datetime type of CreationTime, allowing you to do something like:

($Item.LastAccessTime - $Item.CreationTime).TotalDays

This uses two date objects, and lets you use the relevant duration methods because you are performing math on dates. The methods available could be anything, such as $Item.Encrypt(), $Item.Delete(), and $Item.MoveTo(), all provided by the .NET namespace System.IO.FileInfo. I know you can do many of these things in bash as well, but I'd wager the object pipeline here provides a very solid experience for more complex operations, based on the .NET framework types available.

Wrap Up

This was meant to give a fresh perspective on why some folks have benefited from PowerShell over using shell scripting.
It's a robust language that can give a rich reward for automation, builds, and cloud work if you invest some time to investigate. For me, the basic "right tool for the job" breakdown would look like this:

data: python
serverless: go & python (powershell can do it too, but prefer the others)
web: go & python
basic cli stuff: shell (using Task, which uses shell syntax)
complex cli project tasks: powershell & go
automation/transformation: powershell & python
high concurrency, systems programming: go

Maybe this provided a fresh perspective on why PowerShell might benefit even the diehard shell scripters out there, and maybe it will help convince you to take the plunge and give it a shot. #development #cool-tools #golang #automation The post Consider the Benefits of Powershell for Developer Workflows appeared first on SQLServerCentral.


Pro Microsoft Power Platform: Solution Building for the Citizen Developer from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
2 min read
Over the last several months a team of excellent authors, including myself, has been writing a very exciting new book about Microsoft's Power Platform. We approached the publishing company Apress with an idea to produce a book that really tells the full story of how the Power Platform works together. As I'm sure you know, the Power Platform is actually four tools in one: Power Apps, Power Automate, Power BI, and Power Virtual Agents. We found there were few books on the market that attempted to tell this full story. This book is designed for the "Citizen Developer" to help you feel confident in developing solutions that leverage the entire Power Platform. Are you a Citizen Developer? Citizen Developers are often business users with little or no coding experience who solve problems using technologies usually approved by IT. The concept of business users solving their own problems is not new; what is new is the concept of doing it with IT's blessing. Organizations have realized the power of enabling Citizen Developers to solve smaller-scale problems so IT can focus on larger, more difficult problems. I hope you enjoy this new book and find it helpful in your Power Platform journey! The post Pro Microsoft Power Platform: Solution Building for the Citizen Developer appeared first on SQLServerCentral.


Improving Local Development Workflow With Go Task from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
6 min read
Workflow Tooling

Development workflow, especially outside of a full-fledged IDE, is often a disjointed affair. DevOps-oriented workflows that combine CLI tools such as Terraform, PowerShell, bash, and more all add complexity to getting up to speed and productive. Currently, there is a variety of frameworks to solve this problem. The "gold standard" most are familiar with in the open-source community would be Make.

Considering Cross-Platform Tooling

This is not an exhaustive list; it's focused more on my journey, and I'm not saying that your workflow is wrong. I've looked at a variety of tooling, and the challenge has typically been that most are very unintuitive and difficult to remember. Make... it's everywhere. I'm not going to argue the merits of each tool, as I mentioned, but just bring up that while CMake is cross-platform, I've never considered Make a truly cross-platform tool that is first class in both environments.

InvokeBuild & Psake

In the Windows world, my preferred framework would be InvokeBuild or Psake. The thing is, not every environment will always have PowerShell, so I've wanted to experiment with a minimalistic task framework for intuitive local usage in a project when the tooling doesn't need to be complex. While InvokeBuild is incredibly flexible and intuitive, there is an expectation of familiarity with PowerShell to fully leverage it. If you want a robust framework, I haven't found anything better, and I highly recommend examining it if you are comfortable with PowerShell. You can generate VSCode tasks from your defined scripts and more. InvokeBuild and Psake aren't great for beginners just needing to run some tooling quickly, in my experience; the power comes with additional load for those not experienced in PowerShell. If you need to interact with the AWS.Tools SDK, or complete complex tasks such as generating objects from parsing AST (Abstract Syntax Trees) and the like, then I'd lean towards InvokeBuild. However, if you need to initialize some local dependencies, run a linting check, format your code, get the latest from the main branch and rebase, and other common tasks, what option do you have to get up and running more quickly?

Task

I've been pleasantly surprised by this cross-platform tool based on a simple YAML schema. It's written in Go, and as a result it's normally just a single line or two to install on your system. Here's why you might find some value in examining it: cross-platform syntax using the Go shell interpreter sh; a very simple YAML schema to learn; and some very nice features that make it easy to skip already-built assets, set up task dependencies (which run in parallel, too!), and get simple CLI interactivity. My experience has been very positive, as I've found it very intuitive to build out basic commands as I work, rather than having to deal with more complex schemas.

Get Started

version: 3
tasks:
  default: task --list
  help: task --list
  fmt:
    desc: Apply terraform formatting
    cmds:
      - terraform fmt -recursive=true

The docs are great for this project, so I'm not going to try to educate you on how to use it, just point out some great features. First, with a quick VSCode snippet, this provides a quick way to bootstrap a new project with a common interface to run basic commands. Let's give you a scenario... assuming you aren't using an already-built Docker workspace: I need to initialize my 2 terraform directories. I want to also ensure I get a few go dependencies for a project.
Finally, I want to validate that my syntax is valid across my various directories, without using pre-commit. This gets us started:

version: 3
tasks:

Next, I threw together some examples here: initialize commands for two separate directories, a fmt command to apply standardized formatting across all tf files, and finally, wrapping up those commands with a deps: [] value that will run the init commands in parallel and, once that is finished, run fmt to ensure consistent formatting.

version: '3'
env:
  TF_IN_AUTOMATION: 1
tasks:
  init-workspace-foo:
    dir: terraform/foo
    cmds:
      - terraform init
  init-workspace-bar:
    dir: terraform/bar
    cmds:
      - terraform init
  fmt:
    desc: Recursively apply terraform fmt to all directories in project.
    cmds:
      - terraform fmt -recursive=true
  init:
    desc: Initialize the terraform workspaces in each directory in parallel.
    deps: [init-workspace-foo, init-workspace-bar]
    cmds:
      - task: fmt

You can even add a task that gives you a structured git interaction, rather than relying on git aliases:

sync:
  desc: In GitHub flow, I should be getting latest from main and rebasing on it so I don't fall behind
  cmds:
    - git town sync

Why Not Just Run Manually

I've seen many folks online comment: why even bother? Can't the dev just run the commands in the directory when working through it and be done with it? I believe tasks like this should be thrown into a task runner from the start. Yes, it's very easy to just type terraform fmt, go fmt, or other simple commands... if you are the builder of that project. However: it increases the cognitive load for tedious tasks that no one should have to remember as the project grows. It makes your project more accessible to new contributors/teammates. It allows you to simply move to automation by wrapping up some of these actions in GitHub Actions or equivalent, with the chosen CI/CD tooling running the same task you can run locally. Minimal effort to move it to automation from that point! I think wrapping things up with a good task runner tool considers the person behind you, and prioritizes thinking of others in the course of development. It's an act of consideration.

Choose the Right Tooling

Here's how I'd look at the choices: run as much in Docker as you can. For simple actions driven easily on the CLI, such as build, formatting, and validation, start with Task from the beginning and make your project more accessible. If requirements grow more complex, with interactions with AWS, custom builds for Lambda, or other interactions that can't easily be wrapped up in a few lines of shell scripting... use InvokeBuild or equivalent. This gives you access to the power of .NET and the large module collection provided. Even if you don't really need it, think of the folks maintaining or enabling others to succeed with contributions more easily, and perhaps you'll find some positive wins there. #development #cool-tools #golang #automation The post Improving Local Development Workflow With Go Task appeared first on SQLServerCentral.


External tables vs T-SQL views on files in a data lake from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
4 min read
A question that I have been hearing recently from customers using Azure Synapse Analytics (the public preview version) is: what is the difference between using an external table versus a T-SQL view on a file in a data lake? Note that a T-SQL view and an external table pointing to a file in a data lake can be created in both a SQL Provisioned pool as well as a SQL On-demand pool. Here are the differences that I have found:

Overall summary: views are generally faster and have more features such as OPENROWSET.

Virtual functions (filepath and filename) are not supported with external tables, which means users cannot do partition elimination based on FILEPATH or complex wildcard expressions via OPENROWSET (which can be done with views).

External tables can be shared with other computes, since their metadata can be mapped to and from Spark and other compute experiences, while views are SQL queries and thus can only be used by SQL On-demand or a SQL Provisioned pool.

External tables can use indexes to improve performance, while views would require indexed views for that.

SQL On-demand automatically creates statistics both for external tables and for views using OPENROWSET. You can also explicitly create/update statistics on files with OPENROWSET. Note that automatic creation of statistics is turned on for Parquet files. For CSV files, you need to create statistics manually until automatic creation of CSV file statistics is supported.

Views give you more flexibility in the data layout (external tables expect the OSS Hive partitioning layout, for example) and allow more query expressions to be added.

External tables require an explicitly defined schema, while views can use OPENROWSET to provide automatic schema inference, allowing for more flexibility (but note that an explicitly defined schema can provide faster performance).

If you reference the same external table in your query twice, the query optimizer will know that you are referencing the same object twice, while two of the same OPENROWSETs will not be recognized as the same object. For this reason, in such cases better execution plans could be generated when using external tables instead of views using OPENROWSETs.

Row-level security (Polybase external tables for Azure Synapse only) and Dynamic Data Masking will work on external tables. Row-level security is not supported with views using OPENROWSET.

You can use both external tables and views to write data to the data lake via CETAS (this is the only way either option can write data to the data lake).

If using SQL On-demand, make sure to read Best practices for SQL on-demand (preview) in Azure Synapse Analytics.

I often get asked what the difference in performance is when querying a file in ADLS Gen2 using an external table or view vs. querying a highly compressed table in a SQL Provisioned pool (i.e. a managed table). It's hard to quantify without understanding more about each customer's scenario, but you will roughly see a 5X performance difference between queries over external tables and views vs. managed tables (obviously, depending on the query, that will vary, but that's a rough number – it could be more than 5X in some scenarios). A few things contribute to that: in-memory caching, SSD-based caches, result-set caching, and the ability to design and align data and tables when they are stored as managed tables. You can also create materialized views for managed tables, which typically bring lots of performance improvements as well.
If you are querying Parquet data, it is in a columnstore file format with compression, so that would give you similar data/column elimination as a managed SQL clustered columnstore index (CCI) would, but if you are querying non-Parquet files you do not get this functionality. Note that for managed tables, on top of performance, you also get a granular security model, workload management capabilities, and so on (see Data Lakehouse & Synapse). The post External tables vs T-SQL views on files in a data lake first appeared on James Serra's Blog. The post External tables vs T-SQL views on files in a data lake appeared first on SQLServerCentral.
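For readers who want to see the two approaches side by side, here is a minimal T-SQL sketch for a Synapse SQL pool; the storage URL, data source, file format, and object names are all hypothetical:

-- Option 1: a view over the files using OPENROWSET (schema inferred at query time)
CREATE VIEW dbo.SalesView AS
SELECT *
FROM OPENROWSET(
        BULK 'https://mydatalake.dfs.core.windows.net/raw/sales/*.parquet',
        FORMAT = 'PARQUET'
     ) AS rows;
GO
-- Option 2: an external table with an explicitly defined schema
CREATE EXTERNAL DATA SOURCE MyLake WITH (LOCATION = 'https://mydatalake.dfs.core.windows.net/raw');
CREATE EXTERNAL FILE FORMAT ParquetFormat WITH (FORMAT_TYPE = PARQUET);
CREATE EXTERNAL TABLE dbo.SalesExternal
(
    SaleId   INT,
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION = '/sales/',
    DATA_SOURCE = MyLake,
    FILE_FORMAT = ParquetFormat
);
GO

The view infers the schema through OPENROWSET at query time, while the external table requires the schema, data source, and file format to be declared up front, which is the trade-off described above.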


Fundamental Information for Azure Open Source Databases from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
1 min read
As we all know, Microsoft Azure supports many different database types such as Azure SQL and Azure Cosmos DB, and it also supports MariaDB, MySQL, and PostgreSQL, so it is easy to migrate your current database (MariaDB, MySQL, or PostgreSQL) to Azure. In this post I will share fundamental information about MariaDB, MySQL, and … Continue reading Fundamental Information for Azure Open Source Databases The post Fundamental Information for Azure Open Source Databases appeared first on SQLServerCentral.

SQL Homework – November 2020 – Help! from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
2 min read
If you ask any senior IT person "What is the most important tool you have?" there is a decent chance that they'll tell you something along the lines of Google, Bing, or Books Online. This month I'd like you to spend some time looking at one of the help pages provided by Microsoft. Specifically, take a look at the help page(s) for SELECT. I haven't done this in a while, but here are the tasks and how many points they are worth. Yes, you can get more than 100 points. You can use the extra to help on a previous grade.

Review the Syntax section. Notice that there are two different areas and what they are for. (5 pts)
Some clauses are required, some aren't. How do you tell? (5 pts)
Some clauses can be repeated multiple times. How do you tell? (5 pts)
What does <table_source> mean and where can you go to get details on it? (10 pts)
What does it mean when you see ::=? (5 pts)
Follow links in the Remarks section, for example the FROM clause. (5 pts each; no more than 20 pts for this section.)
Skip to the Examples section without looking at the Permissions section. (5 pts)
Review each example. (5 pts each; can be done on the pages from following the Remarks links; no more than 20 pts for this section.)
Look at the See Also section. Follow some of the links. (10 pts)
Go to the SELECT Clause page and look at the Arguments section. (15 pts)
Read the Remarks section of the SELECT Clause page. (10 pts)

The post SQL Homework – November 2020 – Help! appeared first on SQLServerCentral.
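To make the "required versus optional clauses" questions concrete, here is a minimal sketch against hypothetical dbo.Orders and dbo.Customers tables; only the SELECT list itself is mandatory, and every other clause shown here is optional:

-- Hypothetical tables, purely for illustration of the clauses in the syntax diagram.
SELECT TOP (10)
       c.CustomerName,
       COUNT(*) AS OrderCount
FROM dbo.Orders AS o
     INNER JOIN dbo.Customers AS c
         ON c.CustomerId = o.CustomerId
WHERE o.OrderDate >= '20200101'
GROUP BY c.CustomerName
HAVING COUNT(*) > 5
ORDER BY OrderCount DESC;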


Daily Coping 3 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to think of three things that give you hope for the future. Today especially, election day in the US, it likely feels like hope is elusive for many, but I don't feel that way. I do have some anxiety, but it's mostly about the immediate changes in my country, not the future beyond the next year. I do have hope, and here are three things that I cling to.

Individuals are good

Despite the disagreements I might have with many people about specific issues, for the most part I find that the people I encounter in life, even in this crazy pandemic world, are good. They have compassion, empathy, and kindness towards others who stand in front of them. It's important to remember that most of our world is still our live connections, not the digital ones.

The Next Generations Have Values Beyond Themselves

I don't mean to imply that my generation doesn't think this way, but I find that so many people overall think about themselves more than anything else. They look at their situation, or their family's, as much more important than anything else. They have short-term thinking, in general. Despite the vapid immediacy of how so many young people view the world, I also find them more often thinking about the wider world, other cultures, the planet, and more abstract items, with less concern about money. A vague generalization, but one I think is hopeful for the future of many aspects of the world.

We Locked Down

As much of a pain as it was, and as much disruption as it caused, regardless of effectiveness, I was amazed the world pulled together to shut down most air travel, most borders, and many businesses. It was truly amazing to me that we were able to do this without too much resistance. I know that this has been contentious since then, but in March and April, the world amazed me. Much like I was impressed by efforts like this one. The post Daily Coping 3 Nov 2020 appeared first on SQLServerCentral.


Migrating SSIS to Azure – an Overview from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
3 min read
For quite some time now, there's been the possibility to lift-and-shift your on-premises SSIS project to Azure Data Factory. There, they run in an Integration Runtime, a cluster of virtual machines that will execute your SSIS packages. In the beginning, you only had the option to use the project deployment model and host your SSIS catalog in either an Azure SQL DB or a SQL Server Managed Instance. But over time, features were added, and now the package deployment model has been supported for quite some time as well. Even more, the "legacy SSIS package store" is also supported. For those who still remember this, it's the SSIS service you can log into with SSMS to see which packages are stored in the service (either the file system or the MSDB database) and which are currently running. The following Microsoft blog post gives a good overview of the journey that was made, and it's a definite must-read for anyone who wishes to migrate their SSIS solution: Blast to The Future: Accelerating Legacy SSIS Migrations into Azure Data Factory. Something that surprised me in this blog post was this:

"…our on-premises telemetry shows that SSIS instances with Package Deployment Model continue to outnumber those with Project Deployment Model by two to one" – Microsoft (Sandy Winarko, Product Manager)

This means a lot of customers still use the package deployment model. Many presentations at conferences about SSIS (even mine) are always geared towards the project deployment model. This is something I will need to take into account next time I present about SSIS. Anyway, I have done a fair amount of writing on the Azure-SSIS IR:

Configure an Azure SQL Server Integration Services Integration Runtime
Automate the Azure-SSIS Integration Runtime Start-up and Shutdown – Part 1
Azure-SSIS Integration Runtime Start-up and Shutdown with Webhooks – Part 2
Connect to On-premises Data in Azure Data Factory with the Self-hosted Integration Runtime – Part 1
Connect to On-premises Data in Azure Data Factory with the Self-hosted Integration Runtime – Part 2
Customized Setup for the Azure-SSIS Integration Runtime
Execute SSIS Package in Azure with DTEXEC Utility
Executing Integration Services Packages in the Azure-SSIS Integration Runtime
Parallel package execution in Azure-SSIS Runtime
SSIS Catalog Maintenance in the Azure Cloud
Migrate a Package Deployment Integration Services Project to Azure
Using Files Stored in Azure File Services with Integration Services – Part 1
Using Files Stored in Azure File Services with SQL Server Integration Services – Part 2
Azure-Enabled Integration Services Projects in Visual Studio

The post Migrating SSIS to Azure - an Overview first appeared on Under the kover of business intelligence. The post Migrating SSIS to Azure – an Overview appeared first on SQLServerCentral.


Azure Databricks – Adding Libraries from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
1 min read
It is a really common requirement to add specific libraries to Databricks. Libraries can be written in Python, Java, Scala, and R. You can upload Java, Scala, and Python libraries and point to external packages in PyPI, Maven, and CRAN … Continue reading The post Azure Databricks – Adding Libraries appeared first on SQLServerCentral.

Protect your SQL Server from Ransomware: Backup Your Service Master Key and More from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
4 min read
Not many disaster recovery or SQL migration/upgrade scenarios require the SQL Server instance service master key to be restored. Some do. Recently, by far the most frequent and common disaster recovery scenario for my clients has been the need for a complete, bare-metal rebuild or restore of the master database. Not hardware failure but ransomware (crypto-locking file) attacks have been the cause. You should consider complementary backup solutions that back up or snapshot the entire server (or VM) for SQL Server, but sometimes these technologies are limited or have too much of an impact on the server. A whole-VM snapshot that is reliant on VSS, for example, could incur an unacceptably long IO stun duration when it occurs. Regardless, in all cases, SQL Server backups of each database should be taken regularly. This is a conversation for another blog post, but a typical pattern is weekly full backups, nightly differential backups, and, for databases not in the SIMPLE recovery model, 15-minute transaction log backups. In the case of SQL Servers needing a rebuild from nothing but SQL Server backups, some of the key pieces of information from this checklist will be helpful:

1. The exact SQL Server version of each instance to recover, so that you can restore system databases and settings. Storing the output of the @@VERSION global variable is helpful. Store this in server documentation.

2. The volume letters and paths of SQL Server data and log files would be helpful too. Output from the system view sys.master_files is helpful so that you can recreate the volumes. Store this in server documentation.

3. The service master key backup file and its password are needed to restore certain items in the master database, like linked server information. Though the master database can be restored without restoring the service master key, some encrypted information will be unavailable and will need to be recreated. This is very easy to do, but the catch is making sure that the backup file created and its password are stored securely in an enterprise security vault software. There are many options out there for something like this (I won't list any vendors), but you should be able to store both strings and small files securely, with metadata, and with enterprise security around it, like multi-factor authentication.

BACKUP SERVICE MASTER KEY -- not actually important for TDE, but important overall and should be backed up regardless.
TO FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\InstanceNameHere_SQLServiceMasterKey_20120314.snk'
ENCRYPTION BY PASSWORD = 'complexpasswordhere';

4. In the event they are present, database master key files. Here's an easy script to create backups of each database's symmetric master key, if it exists. Other keys in the database should be backed up as well, upon creation, and stored in your enterprise security vault.

exec sp_msforeachdb 'use [?];
if exists(select * from sys.symmetric_keys)
begin
select ''Database key(s) found in [?]''
select ''USE [?];''
select ''OPEN MASTER KEY DECRYPTION BY PASSWORD = ''''passwordhere''''; BACKUP MASTER KEY TO FILE = ''''c:\temp\?_''+name+''_20200131.snk'''' ENCRYPTION BY PASSWORD = ''''passwordhere'''';
GO ''
from sys.symmetric_keys;
END'

5. Transparent Data Encryption (TDE) certificates, keys, and passwords. You should have set this up upon creation, backed them up, and stored them in your enterprise security vault.
For example:

BACKUP CERTIFICATE TDECert_enctest
TO FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.cer'
WITH PRIVATE KEY (
    FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.key', -- This is a new key file for the cert backup, NOT the same as the key for the database master key
    ENCRYPTION BY PASSWORD = '$12345testpassword123' ); -- This password is for the cert backup's key file.

6. Shared Access Signature certificates, in cases where your SQL Server has been configured to use a SAS certificate to, for example, send backups directly to Azure Blob Storage via the Backup to URL feature. You should save the script used to create the SAS certificate when it is created, and store it in your enterprise security vault.

7. The Integration Services SSISDB database password for the SSIS catalog. You created this password when you created the SSISDB catalog and stored it in your enterprise security vault. You can always try to open the key to test whether or not your records are correct:

OPEN MASTER KEY
DECRYPTION BY PASSWORD = N'[old_password]'; -- Password used when creating SSISDB

More information here on restoring the SSISDB key: https://techcommunity.microsoft.com/t5/sql-server-integration-services/ssis-catalog-backup-and-restore/ba-p/388058

8. The Reporting Services (SSRS) encryption key and password. Back up and restore this key using the Reporting Services Configuration Manager, and store them in your enterprise security vault.

In the comments: what other steps have you taken to prevent or recover a SQL Server from a ransomware attack? The post Protect your SQL Server from Ransomware: Backup Your Service Master Key and More appeared first on SQLServerCentral.
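When the time comes to use that service master key backup on a rebuilt instance, the matching restore is a single statement; here is a minimal sketch assuming the same (hypothetical) file path and password used in the backup example above:

-- Restore the service master key from the backup file taken earlier.
-- Add FORCE only if some encrypted items cannot be decrypted and you accept losing them.
RESTORE SERVICE MASTER KEY
    FROM FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\InstanceNameHere_SQLServiceMasterKey_20120314.snk'
    DECRYPTION BY PASSWORD = 'complexpasswordhere';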


Daily Coping 2 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
02 Nov 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I'm adding my responses for each day here. Today's tip is to start the week by writing down your top priorities and plans. It's the start of a new month, and a new week. Let's get organized. Top priorities and plans:

DPS Presentations
Gym Time
Coaching Decisions
Database Weekly

These are the big items. The PASS Summit is next week, but two recordings are in, and the one I'm giving live I'll practice again next week. For this week, I need to get things prepped for the Data Platform Summit, where I have two talks I need to record in the next couple of weeks. After that, I missed a week of the gym, feeling ill. I got back to things last week, but I want to continue to be on track here and take care of my body. My volleyball team has been playing 15s and 16s tournaments, but it's tough, so I need to decide on a direction and move forward with that. It's Database Weekly week for me, so I need to keep working on that each day for a bit. The post Daily Coping 2 Nov 2020 appeared first on SQLServerCentral.


Database Fundamentals #29: Create Foreign Keys With Table Designer from Blog Posts - SQLServerCentral

Anonymous
02 Nov 2020
1 min read
The purpose of a foreign key is to ensure data integrity by making sure that data added to a child table actually exists in the parent table and preventing data from being removed in the parent table if it’s in the child table. The rules for these relationships are not terribly complex: The columns in […] The post Database Fundamentals #29: Create Foreign Keys With Table Designer appeared first on Grant Fritchey. The post Database Fundamentals #29: Create Foreign Keys With Table Designer appeared first on SQLServerCentral.
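Behind the scenes, the Table Designer generates T-SQL; as a rough sketch with hypothetical parent and child tables, the equivalent statement looks something like this:

-- Parent table
CREATE TABLE dbo.Customer (
    CustomerId INT NOT NULL PRIMARY KEY
);
-- Child table with a foreign key back to the parent
CREATE TABLE dbo.CustomerOrder (
    OrderId    INT NOT NULL PRIMARY KEY,
    CustomerId INT NOT NULL
);
ALTER TABLE dbo.CustomerOrder
    ADD CONSTRAINT FK_CustomerOrder_Customer
    FOREIGN KEY (CustomerId) REFERENCES dbo.Customer (CustomerId);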

Azure DevOps–Using Variable Groups from Blog Posts - SQLServerCentral

Anonymous
02 Nov 2020
2 min read
I was in a webinar recently and saw a note about variable groups. That looked interesting, as I've started to find that I may have lots of variables in some pipelines, and I thought this would keep me organized. However, these are better than that. When I go to the variables screen for a pipeline, I see my variables, but on the left side, I see "Variable groups". If I click this, I see some info. The top link takes me to a doc page, where I see this sentence: "Use a variable group to store values that you want to control and make available across multiple pipelines." Now that is interesting. I have been thinking about different pipelines, so having variables that work across them is good. To create a variable group, I need to go to the Library, which is another menu item under the Pipelines area. I get a list of groups, of which I have none right now. When I click the blue button, I get a form with the group properties, and then a variables section below. I add a couple of variables and add some group info. In this case, I want some secret values that are useful across different pipelines. You do need to click Save at the end of this. In my pipeline, I see I have some variables. On the left, again, is a Variable groups item. When I click that, I don't see any, but I haven't linked any. Here I need to link my group. When I click this, I get a blade on the right. I can then see my group(s), and I can set a scope. I do need to click the group, and then I can click Link at the bottom. Now I see the variables available in my pipeline. That's pretty cool, especially as I am starting to see separate pipelines for different downstream environments becoming more popular. The post Azure DevOps–Using Variable Groups appeared first on SQLServerCentral.


Azure SQL Database administration Tips and Hints Exam (DP-300) from Blog Posts - SQLServerCentral

Anonymous
31 Oct 2020
1 min read
Finally, I got my Azure Database Administrator Associate certification for Exam DP-300, after failing twice. During the journey of study I watched many courses, videos, and articles, and this post is for sharing the knowledge I gained and what I learned during that journey, and I do two things during … Continue reading Azure SQL Database administration Tips and Hints Exam (DP-300) The post Azure SQL Database administration Tips and Hints Exam (DP-300) appeared first on SQLServerCentral.