T-SQL Tuesday #173: Analyze Query Plans With ChatRTX

This month’s T-SQL Tuesday topic is being hosted by Pinal Dave. Thanks Pinal!

He asks the question, “How AI has helped you in your role as a SQL Server professional?”

Forward Looking Statement

I have NOT fully tested what I am about to share, but based on my preliminary tests I think it is important, and I wanted to use this post as a way to get the word out.

Teeing Things Up

First off, analyzing query plans can take time. One of the best tools to assist is the free tool known as Plan Explorer. There are many courses and presentations about using it. I know many of us SQL Server professionals are starting to learn PostgreSQL, but I am not aware of a similar tool that can be used like Plan Explorer. Maybe there is?

Looking at the raw XML in a .sqlplan file is not fun, at least not for me. Yet it contains a lot of properties and values that are important for query tuning.

Security, Security, Security

Due to company policies on AI, you might not be able to share .sqlplan files or upload them to any websites for better analysis. So even tools like ChatGPT might not be an option.

Enter Stage Left: NVIDIA ChatRTX

What if there was a way to analyze .sqlplan files using a local chatbot and NOT require a connection to the internet? Hm…

ChatRTX is now in beta!!! It is a single (quite large) package that allows you to run a Large Language Model locally. Search YouTube for ChatRTX videos and installation guidance.

Personally, I was able to install it successfully on both my Win11 gaming laptop and my Win10 desktop rig. I might do a series of blog posts to get into more details. There are restrictions! It took more than a few hours to get everything installed (update the video drivers first!). This only works on NVIDIA RTX cards, sorry AMD fans.

My laptop was only able to install the Mistral 7B model, but my desktop was able to install both it and the Llama 2 13B model (I have an insane graphics card, an RTX 3090).

For my first test, I pointed it at a folder of PDFs and let it index them. This took time! Then, using the ChatRTX interface, I could begin prompt engineering against my own files! How cool is that?

Show Me The Money

A classic now, Show Me The Money. (NSFW)

Since .sqlplan files are nothing more than raw XML, all I had to do was change the file extension to .txt for the software to pick them up, then tell it to re-scan the folder path.
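If you have a whole folder of plans to convert, a few lines of Python will do it in bulk. Here is a minimal sketch under my own assumptions: it copies rather than renames (so the original .sqlplan files stay intact), and the folder path is a placeholder.

    # Minimal sketch: give every .sqlplan in a folder a .txt twin so
    # ChatRTX will index it. The path below is a placeholder.
    from pathlib import Path
    import shutil

    plan_folder = Path(r"C:\QueryPlans")  # placeholder path
    for plan in plan_folder.glob("*.sqlplan"):
        twin = plan.with_suffix(".txt")
        if not twin.exists():  # don't clobber earlier copies
            shutil.copyfile(plan, twin)
            print(f"Copied {plan.name} -> {twin.name}")

Copying instead of renaming means SSMS and Plan Explorer can still open the originals.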

It was a simple query from the Wide World Importers database. I then asked ChatRTX, “what are the wait stats in 09Apr24.txt” and it came back with:

Based on the given context information, the wait statistics in 09Apr24.txt are:

  • Wait type: “A” (ASYNC_NETWORK_IO)
  • Wait time: 153
  • Wait count: 31
  • Elapsed time: 169

Note that these wait statistics are based on the information provided in the given context information and may not reflect the actual wait statistics of the system.
Reference files:
09Apr24.txt
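For the curious, those numbers are sitting right there in the showplan XML, which is why a local text-indexing chatbot can find them at all. You can pull the same values out yourself with standard-library Python; this is a hedged sketch assuming an actual execution plan from SQL Server 2016 SP1 or later (earlier plans don't record WaitStats):

    # Sketch: read the wait stats straight out of the renamed .sqlplan.
    import xml.etree.ElementTree as ET

    NS = {"sp": "http://schemas.microsoft.com/sqlserver/2004/07/showplan"}
    root = ET.parse("09Apr24.txt").getroot()

    for wait in root.iterfind(".//sp:WaitStats/sp:Wait", NS):
        print(wait.get("WaitType"), wait.get("WaitTimeMs"), wait.get("WaitCount"))

    for stats in root.iterfind(".//sp:QueryTimeStats", NS):
        print("ElapsedTime:", stats.get("ElapsedTime"))

Comparing the chatbot's answers against output like this is a nice way to check that it isn't hallucinating.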

How about that! Imagine being able to ask highly specific performance tuning questions across many files in a secure fashion. I’m just beginning to explore the possibilities.

If you would like to learn more, check out the MEAP book from Chrissy LeMaire, Generative AI for the IT Pro (thanks Kevin Hill for the heads-up). Thanks for reading!


T-SQL Tuesday #164: ChatGPT For The Win (FTW) and Application Programming Interfaces (APIs)

Introduction

This month’s T-SQL Tuesday topic is being hosted by none other than Erik Darling. Thanks Erik!

The question Erik asks is, “Code That Made You Feel A Way” and “it can’t be your own code.”

I’ve been working with SQL Server, mainly as a DBA, since the last millennium, and I’ve seen a lot of really strange stuff (to echo my inner Han Solo). But I’ve never been so excited to see what AI can do. Have you heard about this thing called the Internet? I think it is going to be really big someday…

Roll the clock forward from 1999 to 2023, and ChatGPT enters stage left.

If you would like a semi-technical, short, and free introduction to ChatGPT, I would highly recommend the following course: ChatGPT Prompt Engineering for Developers.

Slight Detour

While I know the majority of the readers of T-SQL Tuesday are expecting SQL Server stuff, I have A LOT of technical interests besides SQL Server. I’m into Virtual Reality, Unreal Engine, GPU Data Science, appropriate technologies and other things that strike my fancy.

My last blog post was in August of 2022. This is July 2023. My current full-time job as a Sr. DBA at Sierra Space has been exciting and consuming at the same time. We are on a mission to do things in space to improve life on Earth. Super short version: I want to become an Astronaut Data Scientist! Perhaps (!) more on that in later posts or a series.

I am currently enrolled in a Master's of Data Science program and expect to graduate in December of 2025. I study and exercise and train like my life and others' depend on it, so I don't have much free time between work, study, sleep, and the gym. Will I ever make it to space? Who knows, but I want to do the things that are in my power to make it so. In the years to come, we are not going to be training tens or hundreds of astronauts from many nations. We are going to be training thousands! Let that sink in. The Orbital Age is upon us.

We tend to make time for the things that are important to us. I'm not getting any younger… So this post is about using ChatGPT for one specific area of interest.

What is This Thing Called Blender?

Blender is a free digital content creation program. Please support open-source; I have a Blender Studio subscription and I love it.

I use Blender to create content for the purpose of importing my creations into the video game engine from Epic Games called Unreal Engine.

Blender, like many programs nowadays, has a well-documented API. And! That documentation has been indexed/crawled by ChatGPT.

Show Me The Money or Code That Is Not My Own

The very first time I tried out ChatGPT I was NOT that impressed. The results were kinda wonky. I saw what others in the SQL community had tried when writing SQL, both DDL and DML, and I was less than impressed.

AFTER I took the course mentioned above, I gave it a second chance. And a third chance. And other chances. And then I was hooked, once I understood the core caveat: when it comes to AI, and to paraphrase Shakespeare, brevity is NOT the soul of wit. To get better results from AI, imagine it as a wicked smart friend who doesn’t have a clue about what you do for a living. Explain things, step by step. Paste in its own errors. Iterate!

When I did that and saw the results, I had a nerdgasm. OMG! This is so frickin’ cool!!!

I, for one, welcome our new AI Overlords. Like the people from OSHA – “We’re here to help.”

Heh, if I tell a Dad joke and the WiFi AI toaster laughs, I’m going to blast it. I have my limits. Others need to think before blindly embracing new technologies just because they are cool, everybody else is doing it, and they are affordable- just sign your rights away in this EULA to use our product/service, blah blah blah. Buyer beware!

Reproducing Others Results

I read this article and followed along. Here is the video I also worked through step by step.

The original author’s video was published in January 2023 so ChatGPT performance has changed since then; I’m using Model: Default (GPT-3.5) and ChatGPT Plus ($20/mo). GPT-4 is now available for Plus subscribers too.

The code it generated for me was different than what I saw in the video. Mine was already compact, but it didn’t use classes like the version in the video did. Since I already know Python and have played around with the Blender API before, I could see where some problems could come up. My last prompt, “refactor and use cube as a class,” did finally make my code look more like the author’s.

While the video was aimed at no-code, copy-and-paste-from-ChatGPT use, I found this an excellent way to get to know 1) how the Blender API works and 2) some unique behaviors of the Blender User Interface (UI) (e.g., having to switch between Edit and Select programmatically).

A skilled Blender user could do all of these things pretty quickly without having to fiddle with the API, but that was the main point of the video: being able to use ChatGPT to help you. One still needs to understand a bit of programming and how to ask intelligent questions. I didn’t know you could just paste in error messages, either, and ChatGPT would try to figure it out and try again.

Here is both the scripting code and final product from the last prompt from ChatGPT for me.
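For those who have never scripted Blender, here is a minimal sketch of what a class-based cube script can look like. This is illustrative only, not the exact code ChatGPT generated for me; run it from Blender's Scripting workspace.

    # Sketch of a class-based cube, in the spirit of my last prompt.
    # Illustrative only, not the exact generated code.
    import bpy

    class Cube:
        def __init__(self, size=2.0, location=(0.0, 0.0, 0.0)):
            bpy.ops.mesh.primitive_cube_add(size=size, location=location)
            self.obj = bpy.context.active_object  # the cube just created

        def rename(self, name):
            self.obj.name = name

    # Build a small row of cubes along the X axis
    for i in range(3):
        cube = Cube(size=1.0, location=(i * 2.0, 0.0, 0.0))
        cube.rename(f"Cube_{i}")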

Probably not that exciting to most people, but storing API source code is so much easier to deal with, in terms of space and readability, than binary objects. Again, all I had to do was copy and paste the final code ChatGPT generated (it did exactly what was asked) and save it.

Outside of Blender, I have had other amazing experiences with ChatGPT, like generating dbatools PowerShell code, RAPIDS code, scripting for Unreal Engine, and more.

Conclusion

In the hands of an experienced professional, I think using ChatGPT to interface with API(s) is a fantastic time-saver and can increase the productivity of so many people. You have to be mindful of intellectual property, trade secrets, etc. Don’t be dumb or put your company or yourself at risk with AI. Educating your new smart friend has its own strengths and weaknesses; just don’t be afraid to do the right thing. The genie has left the bottle and ChatGPT is here to stay. Lean into it and take the time to learn something new, even if it is scary at first. Thanks! –Todd


T-SQL Tuesday #153: Looking Back at PASS 2011 – MVP Authors

This month @kekline is hosting T-SQL Tuesday and is asking the question:

Tell us the story of how attending an IT conference or event resulted in an amazing career or life opportunity.

The year was 2011, in the fall, at the annual PASS conference. I had just started a new job, and one of the things I was adamant about during hiring was attending the PASS Summit. Many people wouldn’t risk losing a job opportunity over something like that, but to be honest, who is in charge of your career? You are.

So I went to Summit to learn and hang out with friends and acquaintances.

Why am I nostalgic about 2011? As I’ve gotten older (I just celebrated the 26th anniversary of my 29th birthday), memories and looking back are becoming more and more important to me. It isn’t so much what happened at Summit in 2011 as watching the #SQLFamily grow, adapt, and overcome over time.

Swag Still In Use

Tell me: of ALL the paid swag or free gifts you have received at conferences, how much is still in use today? Check this out:

PASS 2011 Backpack and SQL Server MVP Deep Dives Volume 2

I tend to want to be a good steward of things that have been entrusted to me. Read- I take care of things and try to make them last. Some of you who have been to the Denver SQL User Group over the years have seen me with this backpack.

While I have blogged in the past about my extensive SQL Server book collection, and I love collecting signatures when I can, this particular book stands out.

SQL Server MVP Deep Dives Volume 2

Why? Because there was a book signing by all of the MVPs who could attend. I have 35 signatures. Most signed their bio page.

Some of these authors have:

  • Moved on into other careers and are no longer active in the SQL community
  • Passed away 😦
  • Retired
  • Switched jobs many times up to today
  • Moved up in the world and are doing quite well for themselves
  • Gotten older and look nothing like their bio picture today (ha!)

My point is this: the #SQLFamily is just that – family. Families change over time!

And! Out of the 65 articles written in 2011, can you guess how many were about the cloud?

Big Fat Zero.

And! Can you guess how many of the technologies presented have been superseded by current tech and software, and are no longer relevant or are now just seen as poor advice?

Several.

My point is this: remember what I said above about who is in charge of your career? You are. It is up to you to control and deep dive into things you are passionate about and can share with others for constructive feedback.

Never Been an MVP

Want to know something? I have never been an MVP. Been using SQL Server since the last millennium- one of my go-to Dad jokes I store in my Dad-abase (!)

I have been an active member of my local user group for almost a decade, hosted several SQL Saturdays, traveled to speak, etc., but just never qualified for what seemed like the ever-changing requirements of the MVP program: how much do you blog, how many Microsoft licenses and how much revenue can you bring in, how many people do you know (like a popularity contest), how many and what types of certifications do you have. I have heard so many explanations over the years as to why some people get picked and some people do not. So I stopped worrying about it as a career goal long ago and just try to do the best job for my current employer that I can. Over the years I have learned to stop comparing myself to other people and to just focus on doing the best that I can for my Lord, my family, myself, my country, and my community- including all of you!

Over time it has been a joy to watch people advance their careers and grow their families in whatever form that looks like. The last eleven years has seen such an incredible amount of changes since Vol 2 of the Deep Dives came out that it really is a snapshot in time. Will there be a Vol 3 someday???

Why Such A Long Hiatus From Blogging Todd?

As I ask myself this question in the third person. Long story; I will try to tell it another time, maybe. On April 11th I started a fantastic new job at Sierra Space as a Senior DBA. It was like working at a start-up all over again. Mentally spent at the end of the day, constant meetings start to finish, pushing myself, excited to jump out of bed and go into work (yes- I am one of THOSE people who work best around other people; I’ve worked ~3 years WFH and I’d had enough). The spring and summer of 2022 have been a lot of long days, but good days.

Slowing Down to Think

Another long story short: my car died. It has been towed three times and into two different shops over ten times. Both they and I have replaced the computer and every single sensor, and the problem appears to be intermittent. It can run fine for months, then will barely drive at all. With the insane increase in the price of gas here in the US, I decided to do something crazy. A buddy at work has one of these, and with my EcoPass I can take it with me and ride free (!) on the buses or trains in the Denver Metro area. FYI: I have over 458 km/275 miles on it to date. Tires are showing some wear!

Segway Ninebot on a bike trail in Cherry Creek State Park

I’ll blog more about my scooter experiences later but it has given me more time to think as it has forced me to slow down these last few months. Especially while riding and waiting on trains and buses. In case you were wondering, my car is still in the shop too.

Conclusion

Hey Kevin! If I can get my hands on a copy of SQL in a Nutshell, 4th Edition, I will make it a point to find you someday at an event and get your signature on it too. Thanks again for hosting; you too are someone I have admired and respected all of these years! —Todd Kleinhans


T-SQL Tuesday #147: Access to SQL: My First Upgrade

T-SQL Tuesday

This month is being hosted by none other than Steve Jones. Thanks Steve! The topic is Upgrade Strategies. My first database upgrade was going from Microsoft Access to Microsoft SQL Server 2000. Why? “Because I needed the money.”

Scenario

I had just been laid off for the first time in my life, from a dot-com. I was a classic ASP web developer and a junior development DBA, and I knew Access and FileMaker Pro. I interviewed and got hired on as a contractor to help with Access and ASP.

Before me, a local consulting company had been retained to help them with the migration from Access to SQL Server. It was a disaster.

The company I was contracted to help was a local home builder and they were having scaling problems with both their internal and external applications. They had just spent A LOT of money and were disappointed it didn’t work.

So I was brought on to help with several things including trying to figure out why the previous attempt had failed so badly.

My First Database Class

Back in graduate school, I had taken my first formal database class. It was taught by a professor who was in the midst of writing his own book…

As in, we the students had to read the chapters as they were printed, in a three-ring binder. The database of choice for the class? Access.

I had purchased my own book, read it cover to cover, and jumped into Access with both feet. At a previous job I had already gotten some experience with Access, so the class was going to be both fun and straightforward for me.

The Crime Scene

A problem commonly seen in mom-and-pop businesses is nepotism. The mother of one of the employees had created the original Access database from scratch, and the entire business was built around it. It was never designed to run a business at scale, and now the owner was tired of the problems and of dealing with her- hence part of the desire to upgrade.

Technical details: Access is a stand-alone database with all kinds of client-facing tools built into it. Write queries, create and run reports, store data- everything, all in one file.

The way it was being used was amazing to me, because I was surprised it even worked at all! Imagine an Excel spreadsheet on a file server with everyone having access (ha!) to it. Employees open up their own personal local copies, each linked to the main copy on the file server in the main office. And! The company had a classic ASP website, and it too connected to the spreadsheet. Now replace the word spreadsheet with Access database and you get the picture. So it was an Access-to-Access set-up, plus Access to a web server.

There was a long list of technical problems, but the squabbling was the worst. Several employees wanted the owner to just give so-and-so’s mom more time and money to fix things in Access. Others hated the system, like our realtors- they felt ashamed recommending customers browse home listings on a slow web site. There is no concept of temporary tables in Access: if you need to temporarily store results, you have to save them to a table first. Views calling views calling still more views. No stored procedures. No regular index maintenance. You could run the same report twice and get different results. Many times the database would literally lock up and no one could do anything until we could resolve the lock-file issues.

https://docs.microsoft.com/en-us/office/troubleshoot/access/lock-files-introduction

The recent failed upgrade attempt left many people angry and upset, because the system was down during the attempt and no one could get any work done in the office or in the field.

Agent K Assigned to the Case

I was walking into a bad situation. The IT Manager told me everything up front, and since I was a contractor, if this didn’t work out I wouldn’t be coming back. However, he knew this was a technical issue, that it was solvable, and he had faith in me.

But could I do it?

Looking At The Evidence

I was still fairly new to T-SQL, and the scripts from the previous attempt looked foreign to me. They were trying to run things that had nothing to do with Access. So I whipped open my Access Bible and read all that I could about connecting to SQL Server and upgrades.

Mission Impossible

It was just me. All I had was a handful of people to test whatever it was I came up with.

1) Upgrade/convert the backend Access database on the file server to SQL Server.
2) Change and test the client front-end Access (forms, reports, queries) to use SQL Server.
3) Change and test the classic ASP code to use SQL Server.
4) Do a good job and you’re hired. Do badly, and there is the front door.

My First Attempt: Failed But Saw Clues

I worked on #1 and it did not work. The database itself upgraded just fine to SQL Server, but the Access application wouldn’t work correctly and the web site returned all kinds of strange errors. I tried not to keep looking at the front door. I rolled back the deployment. My only saving grace was my experience with Access. It handled dates differently than SQL Server, and I could see that right away- something the previous company had completely missed. The previous company had also neglected the data source names and drivers on machines (Start: Run: odbcad32), MDAC, and things like proper connection strings in ASP.

My Second Attempt: Victory!

On my second attempt there were some hiccups, but I finally got things working correctly. No more errors or locking. I got hired. Even when we ran into issues after the upgrade, I’d still hear, “Well, when so-and-so was building in Access, we never had these problems.”

Conclusion

I did not have all of the answers or technical skills going in. I had limited or no help. My back was against the wall (recently laid off, young family, mortgage, etc.) and I had to figure things out or I would be looking for another job.

The lessons I learned from that project still help me today: treat it like a project, get an executive sponsor, know the full stack, watch for data type mismatches, deal with partial information and a lack of support, trust your instincts while staying open-minded, and many more.

Thanks for reading!


T-SQL Tuesday #143: import this

October’s T-SQL Tuesday is being hosted by John Armando McCormack – thanks John!

According to his post, we can pick any snippet or consistent go-to code we reach for on a regular basis.

For a while now, I have been teaching myself Python and using it more and more every day. It has become my programming lingua franca of late. Will SQL Server Integration Services in version vNext (to be announced at the upcoming Ignite??? SQL Server 2019 is showing its age) ever natively allow Python? In the Script Task component the choices are Visual Basic or C#, but no love for Python…

In the world of data science Python is all the rage. Don’t get me wrong, I love 4th generation programming languages but for me at this point of my life right now, Python is where it is at.

Heh, just like they announced availability of Visual Studio 2022 (up from 2019), I predict SQL Server 2022 will be announced at Ignite. I am also NOT under any kind of NDA with Microsoft so come at me bro!

A tool of many data scientists is the Jupyter Notebook. Type some code and run it, and it can also save the results with the notebook, unlike SQL Server Management Studio.

So! My favorite daily reminder in my notebooks is -> import this <-

Python has a beauty in and of itself. Run ONE line of code and you can see The Zen of Python. Awesome!
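If you have never tried it, here is the entire trick, with the first few lines of output shown as comments:

    import this

    # The Zen of Python, by Tim Peters
    #
    # Beautiful is better than ugly.
    # Explicit is better than implicit.
    # Simple is better than complex.
    # Complex is better than complicated.
    # ...and 15 more aphorisms.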

Many years ago, our friends over at Redgate published a book by Phil Factor called SQL Code Smells. I wish many of these could be made into The Zen of SQL. If you are new to SQL, please consider adding this to your arsenal, and don’t forget to click on the code examples too. You might not understand everything now, but in due time you will see many of these gems and say, “Ah-ha! I remember reading that somewhere!”


T-SQL Tuesday #141: Checking Out of Reality Without Harm

TJay Belt’s topic for this month is Work/Life Balance, and he asks a series of questions. I am going to go with, “What tips or tricks can you share to help others?”

The most widely used virtual reality headset that is not connected to a gaming console is the Oculus Quest. On August 24, 2021, Oculus will make available a 128GB Quest 2 for $299. It will include the new silicone cover- no more sweaty foam, and it makes the headset much easier to clean and wipe down. I have a Quest 1 and have been patiently waiting to dive in and get a fully-loaded Quest 2.

Besides being able to play LOTS of great games, people are getting into VR fitness. Yes- being able to work out and get sweaty with this contraption on your face.

The headset works wirelessly standalone, and you can still connect it to your PC (wired or wireless- you MUST be on the same wireless network, and it needs to be fast in order to connect to your PC).

I still get up in the morning, drink coffee, and draw in VR. Drawing around you with digital paint, sculpting with digital clay- take your pick.

My initial interest, then and still now, is art galleries. This link will show a demo. Those are personal photographs and they are 6 feet tall. You can walk up and see details that you simply cannot see just looking at images on a computer screen.

For those #SQLFamily members who own a Quest 1 or a Quest 2, and are interested, I’ll offer to build you a free art gallery. Hit me up on Twitter and we can chat.

A drug- and alcohol-free way of checking out of reality, and being able to do many things in a healthy and fun way with a bit of affordable technology, has helped me cope and deal with life lately.


T-SQL Tuesday #138: Only the Plugged-in Survive

T-SQL Tuesday Logo

This month’s T-SQL Tuesday topic is Managing Technology Changes, hosted by my brother from another mother, Andy Leonard. Thanks Andy!

Premise

“How Do You Respond When Technology Changes Under You?”

As I like to say, I’ve been using SQL Server since the last millennium and I’ve seen a lot of things come and go (to echo my inner Han Solo).

The moment you think you are an expert and there is nothing else left to learn, that’s when you are in trouble. There is ALWAYS something new to learn as one perfects one’s craft.

Only the Paranoid Survive

Years ago I read a book by Andy Grove, one of the founders of Intel, called Only the Paranoid Survive. It can be summed up as: look both inside and outside of your organization, and never settle for thinking you know everything about everything. It was great advice then and it still is now. Sadly the executives at Intel, IMHO, got “Fat, Dumb, and Happy”- a colloquialism meaning they allowed themselves to get comfortable and took their focus off things. Today AMD is eating their lunch in the processor markets- how did they let this happen when they used to be number one by a large margin?

Before that book came out, and after I had graduated from college and done a brief stint at Radio Shack, I worked at a market research company. They specialized in competitive intelligence in the energy industry. We stored our primary research in, you guessed it, a database system. This was my first exposure to using SQL on a green screen. The datacenter contained a system by Sequent Computer Systems.

Only the Plugged-in Survive

Today we don’t use paper-based media monitoring services anymore- everything has gone digital. But it is still “catch as catch can” even using things like Google Alerts. Keeping up by using social media? Good luck with that, even with aggregation tools. You could miss something if you are not looking at a computer screen. FOMO is real for the paranoid!

Let’s see; I am on and try to keep up with: LinkedIn, Discord, Slack, Reddit, Facebook, Twitter, Instagram. I tend to let serendipity and dumb luck drive some of my viewing habits. I read books (!), and go to several Meetups (100% over Zoom at the moment, ugh). Same for on-line conferences. I listen to podcasts. I subscribe to too many e-mail newsletters. I watch streaming shows. I have conversations with people. And yet I could still miss something important!

My lingua franca today is Python 3. I’m using it more and more in the things I’m interested in both personally and professionally.

I’m currently going through a book (second edition) and Udemy course called Automate the Boring Stuff with Python Programming. NOTE!!! The author has generously made a Creative Commons version (free) available on the book’s website. So why mention the book and course? Because I am learning about web scraping for personal use- please don’t break the law.
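To give a flavor of what the book teaches, here is a minimal scraping sketch. The URL and CSS selector are placeholders rather than the book’s own examples, and you should check a site’s terms of service and robots.txt before scraping it.

    # Minimal scraping sketch in the spirit of the book.
    # The URL and selector below are placeholders.
    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://example.com/news")
    resp.raise_for_status()  # fail loudly on HTTP errors

    soup = BeautifulSoup(resp.text, "html.parser")
    for link in soup.select("h2 a"):  # hypothetical headline links
        print(link.get_text(strip=True), "->", link.get("href"))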

RAPIDS is Rapidly Changing

In my last post I talked about how I’m taking a new, technical, immersive deep dive into RAPIDS. Since I wrote that, I have acquired (again, through dumb luck) an NVIDIA RTX 3090 for my gaming, VR, and data science interests. This card requires three (!) power connectors. It is a beast!

Alas, even as I’m taking copious screenshots and notes in OneNote, they are releasing new features with every release- roughly every six weeks- which at times is faster than I can learn them. And they are re-naming things too. So even as I’m documenting and coding my personal corpus on RAPIDS, I’m performing find-and-replace on content. On newly learned info! I also have to change some of my mnemonics to reflect the changes, which is frustrating to me.

Even if I get the chance to write a book about it, like Andy, imagine the overwhelming feeling of writing something, parts of which could be obsolete by the time it is published! Ah, the joys of being a technical writer 🙂

Learn and try to stay plugged-in. Talk to people. Ask questions on forums.

Get Outside

People, there is life outside of an electronic screen. Just do the best you can and don’t sweat it. The chances that I’m going to be blindsided by some new hotness are slim to none. Even if I’m not the first to know, I’m sure that if it’s important, somebody in my social circles will hear about it.


T-SQL Tuesday #137: The Best of Both Worlds – RAPIDS and SQL Server

During Lent 2021, I stepped away from writing and social media. Winter is over and I’m looking forward to a lot of writing in the months ahead.

This month’s T-SQL Tuesday topic by Steve Jones is on using Jupyter Notebooks. I first saw them in use at the PASS conference in 2019 (remember PASS?). What I thought was cool was the ability to both run code and save the results too. You can even have text, images, etc.- just like a paper notebook.

I’ve been a SQL Server DBA for most of my data career. Some of you have wondered what the future looks like for DBA(s), and something I’ve wanted to have a reason to get into is data science. I don’t remember exactly how I first heard about RAPIDS from NVIDIA; it was probably something I stumbled across from following them on Twitter.

NOTE!!! Don’t miss the free conference starting on 12Apr21 all week, they will have a lot of new announcements and plenty of free training resources too: GTC.

What Is RAPIDS?

From their website, “The RAPIDS suite of open source software libraries and APIs gives you the ability to execute end-to-end data science and analytics pipelines entirely on GPUs. Licensed under Apache 2.0, RAPIDS is incubated by NVIDIA® based on extensive hardware and data science experience. RAPIDS utilizes NVIDIA CUDA® primitives for low-level compute optimization, and exposes GPU parallelism and high-bandwidth memory speed through user-friendly Python interfaces.”

The teams at NVIDIA have been working really hard to maintain a familiar Python syntax for existing libraries while having them run on the GPU(s). Here is their release roadmap; version 0.18 is what I’m using.

GPU stands for Graphics Processing Unit. Anybody with an NVIDIA graphics card (Pascal or higher) can run RAPIDS. Right now, it only runs on Linux; I failed to get RAPIDS running under the Windows Subsystem for Linux (WSL) as I only have a Maxwell-based GPU in my gaming rig. I’m hopeful that once I get a new video card, I’ll re-visit running RAPIDS on WSL. If you are wondering, I tried to follow the steps both here and here.

Here is a video talking about how Wal-Mart is using RAPIDS.

Dual Boot Using An External SSD Drive

It just so happens I received a new laptop for work and it has a Pascal GPU! But alas, this is a work system so I can’t install Windows Insider Preview versions. I also didn’t want to risk fiddling with partitioning the internal drives either.

Lo and behold! I stumbled across this video while researching how to install Ubuntu on an external drive. Since I had all of the parts, I didn’t have to buy anything. NOTE!!! The video is over a year old and some of the steps have changed. Just be sure to stick with Ubuntu 18.04; don’t upgrade to 20.04.

I have been running this for a few weeks now without any problems. My work laptop is unaffected and I can learn RAPIDS using this configuration albeit with a limited amount of GPU memory.

Installing SQL Server on Linux

Sticking with the instructions for Ubuntu 18.04, I installed SQL Server on Linux. I also got Azure Data Studio for Linux installed. I downloaded and restored the AdventureWorks databases, then ran through several steps to get the Python drivers installed. This site was very helpful (remember to stick to the 18.04 instructions) and I was able to get things running from within a Jupyter notebook. Finally!

Running Jupyter Notebooks Using Docker and RAPIDS Container

Start the RAPIDS container, then click on the link [http://localhost:8888] in a browser like Firefox and the notebook will open.

Since I’m focused on leveraging existing data skills to learn GPU things, I was interested in tools that had a SQL syntax. Note there are several notebooks covering many of the tools within RAPIDS, which can be found in the container.

(Caveat: the following examples are from https://app.blazingsql.com, which will be shutting down and re-branded as something else? Time will tell.)

What follows are several screenshots from me copying code from their site and running it locally on my machine. These examples use NY taxi data (46MB).

What is so neat and cool to me is to be able to run queries and plot data without a whole lot of headache or software installation.

SQL Server -> Pandas -> cuDF

The component known as BlazingSQL is an in-memory data analytics tool which uses the CUDA DataFrame (cuDF); it doesn’t persist data. By using SQL Server, one can use all of the tools many of us are familiar with. The following screenshots show how I connect to SQL Server on Linux using Python inside of a notebook, load a GPU dataframe, and then show some metrics about it.
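Here is a condensed sketch of what that notebook code boils down to; the server name, credentials, and query are placeholders, not my real setup.

    # Condensed sketch: SQL Server -> pandas -> cuDF.
    # Connection details and the query are placeholders.
    import pyodbc
    import pandas as pd
    import cudf  # RAPIDS GPU DataFrame library

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=AdventureWorks;"
        "UID=sa;PWD=<your password>"
    )

    # Pull rows into a regular (CPU) pandas DataFrame first...
    df = pd.read_sql("SELECT * FROM Sales.SalesOrderDetail;", conn)

    # ...then hand the whole thing to the GPU as a cuDF DataFrame.
    gdf = cudf.from_pandas(df)
    gdf.info(verbose=True)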

These are the results from running gdf.info(verbose=True) from above:

Conclusion

I know this was a whirlwind tour. Jupyter notebooks have made it easier to query data, see results, and do Pythonic stuff. RAPIDS is getting a lot of attention because it can do things on the GPU much faster than CPU-bound tools like SQL Server can. I hope to write more about RAPIDS and its use in data science.


T-SQL Tuesday #134: Stopping to Breathe Again and Talking Avatars

The first post of 2021 for T-SQL Tuesday is being hosted by James McGillivray, and his updated post on the subject of taking a break can be found here. Thanks James!

Sadly, one of the members of the #SQLFamily, Gareth Swanepoel, has passed away due to complications from COVID-19, and the organization once known as PASS will be no more as of 15Jan21.

I maintain #SQLMemorial and you can watch the latest tribute video to the fallen here.

Week 1: A Fresh Start

I’ve written Part 1 and Part 2 about getting into MBSR (Mindfulness-Based Stress Reduction), so this is more of a re-start than a Part 3 post. When the pandemic really kicked into high gear in mid-March 2020, I just didn’t stick with the commitment of 40-45 minutes each day for eight weeks. A few weeks into it, I spent my time getting outside for walks and taking power naps instead.

2021 is a new year with new goals, so I’m stopping to breathe again. The secret? Put it on your calendar throughout the day and make time. All of the PDF(s) and audio files for the MBSR course I’m going through can be found here.

The Good, The Bad, and The Super Scary (Or Not)

I have about 100 pages left of a book I’m reading, VRx: How Virtual Therapeutics Will Revolutionize Medicine by Dr. Brennan Spiegel.

It is a fascinating book and I’m really enjoying it. In one of the chapters, he lays out an exploration into augmenting an existing therapy with VR for people with schizophrenia, which grabbed my attention.

Back in March 2019, I wrote a little about digital avatars and the incredible progress in creating life-like avatars using something other than a human: T-SQL Tuesday #112 “A New Cookie Jar”.

Now this technology is becoming more accessible. Take a look at the beta version of the Audio2Face video, created as part of NVIDIA’s Omniverse suite of software. Please note that it is only using an audio file- no programming by the end user was required.

Back to VRx. Imagine someone being bullied by their inner demon: a satanic figure with two ram-like horns, blood-red skin, and massive bat wings, talking to you from a short distance away from your face. “You are no good. You are a bad father. You are a bad husband. You are worthless!” Chiding, insulting, distracting, incessantly.

Disturbing visuals and audio, to say the least. But that is what a psychiatrist has been able to do: create an avatar as close as possible to his patient’s hallucination, but using the psychiatrist’s voice. And over the course of treatment, the audio becomes, “You’re not as bad as I thought. I underestimated you.”

So, let’s look at the flip side for positive mental health applications. Imagine someone or something saying positive, uplifting and affirming quotes and sayings. How would that make you feel? Better than trying to talk to oneself in a mirror I suppose.

While positive psychology entails more than just technology, there could be interesting times ahead for positive avatars helping and encouraging people.

One More Thing

I know several folks in the #SQLFamily are into video games, myself included. Epic Games releases a FREE game every week from their on-line store.

On 14Jan21, for all of you Star Wars fans, Star Wars: Battlefront 2: Celebration Edition will be released and free for a week. Once you claim it, it is yours. I have over 120 games downloaded from the store over the past two years- all for free.


The 2020 Fall Unreal Engine Learning Challenge – Lessons Learned

Introduction

I have been into Unreal Engine as a hobby for a while, though I’m not a game developer by trade or by education. When learn.unrealengine.com came out, I was excited to take courses.

On November 18, 2020, they announced the 2020 Fall Unreal Online Learning Challenge and I decided to jump in. Truth be told, I had already taken the Build a Detective Office Game Environment course but I took it again anyway as a refresher.

I hope this Lessons Learned post will inspire people to jump into Unreal Engine now and in the future. It is a fantastic game engine, and thanks to this FREE learning portal, it is not as intimidating to learn as it once was. As the saying goes, if I can do it, anyone can do it!

About Me

I have been working with databases as a full-time job for over 20 years and currently work for a non-profit. I help run the Denver/Boulder Unreal Engine Group and thought this would be something fun to do- to lead by example and dive right in. Someday I really want to get back to meeting in person (I’m in Zoom calls all day long). I use UE4 for a lot of personal projects and am looking to do even more with it in 2021. Stay tuned!

Attack Plan

In order to take all five courses in the time allotted, I really needed to focus and set aside other tasks so I could concentrate and complete them on time. My methods are as follows; they may or may not work for you, but they work well for me.

First, I use OneNote for screenshots of the slides presented, in addition to stopping the videos and taking notes.

The authors of these courses are very knowledgeable and I don’t want to miss anything.

Second, and this is extremely time-consuming, I watch ALL of the videos straight through, stopping to take notes only if something rings faint bells and I need to research it further. Meaning, I just want to “see” the whole thing in a short period of time, to get a feel for what I’m getting myself into.

Third, I re-watch them, going slowly, taking notes as I go along, and working through all of the examples with UE4 open, following step by step. If one were looking over my shoulder, one would see me using two screens: OneNote, UE4, and the Snipping Tool on one, and the course itself on the other. I’m constantly cycling through windows using Alt+Tab as I go through the material.

In my experience so far, I average about 2-3 hours (sometimes longer if I’m not immediately getting or understanding everything) for EVERY hour of content. So a three-hour course might take me all day using this method.

One more thing which I find extremely important: REVIEW!!! If one is taking the time to laboriously take so many notes, take time to review them immediately after writing them, and glance at them before starting the next module. Try to recall (from memory) what the content was about, along with your own notes on it. This is hard and yes, even boring at times- new content to learn is calling!!! However, if one really wants to understand the material, slow down and take time to review. It is an investment, not a waste of time.

I do try to follow the authors on Twitter if they are on there (I’m @toddkleinhans, by the way) and add any and all links to books, courses, and other resources they recommend. Note that some of the courses they reference may or may not have been published yet.

Other Thoughts and Observations

4.26 is the latest version as of this blog post on 13Dec20. It is the nature of software development to always be upgrading. I had to install 4.23 to complete some of these courses, and it felt quaint 🙂 Always use the version the authors are using, as this will help minimize the risk of the project files not working correctly.

I wish the courses had transcripts- I spent A LOT of time stopping and starting the videos over and over again just to catch and type what the author(s) were saying.

Learning about the organization and naming system for things was valuable too.

I LOVED the deep-dive into The Village for some of the courses. An exploration into the look, feel, mechanics, Blueprints and C++ code behind the game. Being able to “wrap my brain” around it was important and highly instructive to me.

Even though I have successfully completed the Challenge, I can’t wait to take other courses in 2021. I not only want to see many more badges on my Dashboard but, more importantly, I want to be able to apply what I have learned. And re-visit the courses when necessary- don’t be embarrassed if you forget something or need a refresher!

Personal Project

After completing this Challenge, I now have the technical insights and expertise to re-visit and improve a personal project of mine.

My first blog post on it can be found here.

Some of the technical aspects can be found here. Note that I now know how NOT to do things in the Level Blueprints; I’ll be changing and improving it based on the audio courses I just took. Heh, I did have that overwhelming feeling of, “Oh… I didn’t know that… I need to change that when I get a chance!!!”

My latest YouTube version (using OBS to screen record audio and video) here.

Conclusion

Keep taking courses, taking notes, and adding to one’s own toolbox. This Challenge was intense for me. It gave me a good proverbial kick in the pants to step up and dive in while, yes, at times being COMPLETELY outside of my comfort zone, but slogging through, re-watching things over and over again until I “got” it.

The Challenge opened my eyes to several new things and many ways to improve existing projects I am working on. Thanks for reading!
