Talks on Managing Data and Applications Anywhere
January 14th, 2020. What does this date mean for your organization? January 14th, 2020 was the end of support for Windows Server 2008 and 2008 R2. Even more so than with SQL Server, falling out of support for Windows Server has immediate consequences for the security of your systems because of the frequency and volume of vulnerabilities and their subsequent patches. And if this January is anything like every […]
One of our customers contacted DB Best to help address performance issues with their SQL Server application. They started experiencing problems after upgrading from SQL Server 2008 R2 to SQL Server 2016. After analyzing their database system, we discovered a complex issue with updating SQL Server statistics.
We used industry best practices and internal expertise to correct these issues. Read our blog post to learn how we approached this complex task.
Our customer’s data science team needed to run read-only data warehouse queries against a mission-critical online transaction and batch processing database. These ad-hoc queries scanned more than 4 terabytes of historical data, looking at multiple months of history at a time. They consumed all available I/O on the production database environment and impeded the performance of the primary workload. This blog post describes how we helped the customer solve this problem.
Recently, the DB Best team migrated a customer’s database from Apache Cassandra to Amazon DynamoDB. As part of this project, we needed to run multiple tests to ensure the quality of our delivery. However, we faced a problem in setting up the test environment. Even though we were using AWS CloudFormation templates for the EC2 instances, it took over 30 minutes to set up a dev/test environment, and our developers couldn’t run regression tests in parallel. So, we decided to leverage Docker to automate the environment setup.
In this blog post, we share our experience using Docker to automate a dev/test environment for a NoSQL database migration.
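As a rough illustration of the approach, a containerized source/target pair for such migration tests could be described in a single Compose file. This is a minimal sketch under stated assumptions — the image versions, service names, and ports are illustrative, not the actual project configuration:

```yaml
# docker-compose.yml — hypothetical sketch of a migration dev/test environment:
# one Cassandra node as the migration source and DynamoDB Local as the target,
# so each developer can spin up an isolated environment in seconds.
version: "3.8"
services:
  cassandra:
    image: cassandra:3.11          # source database for the migration tests
    ports:
      - "9042:9042"                # CQL native transport port
  dynamodb-local:
    image: amazon/dynamodb-local   # local DynamoDB emulator as the target
    ports:
      - "8000:8000"                # DynamoDB Local HTTP endpoint
```

With a file like this, `docker compose up -d` brings up a fresh source/target pair on demand, and running separate Compose projects lets developers execute regression tests in parallel instead of sharing one slow-to-provision environment.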
One of our customers had a Management Data Warehouse (MDW) running on their production SQL Server 2012 instance and didn’t know what to do with it. Microsoft introduced the MDW with SQL Server 2008. However, a bug in SQL Server 2012 prevented the Query Statistics data collector from working. Rather than trying to fix it, we decided to remove the MDW instead.
Removing the MDW from a production SQL Server is straightforward if you follow the steps described in this blog post.
What do you do when you’re one of the global leaders in the beauty industry and your growing data environment is getting out of control? You set up a call with DB Best and start talking with one of our dedicated project managers. We began with just a single developer working with this customer, but the team soon grew to include 13 more. After 5 years, what started as a data environment analysis has grown to a full […]
One of the world’s largest multimedia content providers was looking for a way to increase user satisfaction by improving the performance of their enormous database system. They wanted to apply operational best practices to increase resiliency, uptime, performance, and scalability at the lowest possible cost. DB Best provided the customer with day-to-day operational data management services both on-premises and in the cloud, and configured the new architecture for their cloud-based system. Discover the benefits […]
One of our public sector customers was running a distributed SQL Server 2016 database system and was considering an upgrade to consolidate the servers into one data center, or even a move to the cloud. We helped them by using our DBMSys platform to perform an overall database health check. As part of this health check, we found several high-priority issues that required immediate action. In addition, we provided the customer with detailed information on their existing database […]
Amazon introduced data extraction agents in a previous release of the AWS Schema Conversion Tool (SCT). These agents let you migrate all the data from your on-premises data warehouse platforms to Amazon Redshift. But what if you don’t need to move all those terabytes of data and just want the data generated in the past couple of years? Starting with version 1.0.603, SCT allows you to filter the data before uploading it to Redshift. In addition, the latest SCT […]