DB Best Chronicles
 
Talks on Big Data, Mobile Apps, Web and Software Development

Now is the time to consider augmented reality applications

Posted on January 16th, 2017 | In Mobile Development, Web & Software Development
 

The mobile applications industry delivers good news: augmented reality (AR), virtual reality (VR), voice control, and other impressive technologies are becoming more and more popular. More importantly, modern smartphones have enough power to process the graphics for AR and VR. You can simply insert your smartphone into an affordable VR headset to dive into virtual reality. As a result, most smartphone users can now access this variety of breathtaking technologies. Moreover, these technologies are not expensive, and they do not require extra power from a mobile device.

To see how the DB Best-powered AR & VR application works, please watch the following video.

We’re confident that you’re already wondering how you can use these incredible technologies. DB Best can work with your company to create a similar application for your business in a very short time frame. Contact us now to learn how we can bring your ideas to life and implement them in an AR & VR application for your business.

Read below for more details on how this application prototype works.
Read the rest of this entry »

Portfolio spotlight: Using iBeacons in mobile application for exhibitions

Posted on January 4th, 2017 | In Customer Story, Mobile Development, Web & Software Development
 

When visiting an exhibition, you may want to use a mobile application developed specifically for that event. This is especially true for air shows, which tend to cover large areas. During the show, you will definitely want to know exactly where you are, what's around you, and where to go next. Using map services to implement these features is a fairly simple and common task. But how do we get information about the various exhibits? This is where a truly fascinating IoT device comes into play. We used Apple's iBeacon technology as part of our Live Airshow mobile application to enhance the attendee's experience.

Please watch the following video to learn more about this technology and the other features of the Live Airshow application.

Read the rest of this entry »

First look at AWS Schema Conversion Tool with tighter DMS integration

Posted on January 3rd, 2017 | In AWS, Database Migration
 

On December 20, 2016, Amazon released version 1.0.502 of the AWS Schema Conversion Tool (SCT). SCT now integrates with the AWS Database Migration Service (DMS) and allows you to create DMS endpoints and tasks. In addition, you can run and monitor DMS tasks directly from SCT.

AWS Database Migration Service helps you migrate databases to AWS easily and securely. It can migrate your data to and from most of the widely used commercial and open-source databases. The service supports homogeneous migrations, such as Oracle to Oracle, as well as heterogeneous migrations between different database platforms, such as Oracle to Amazon Aurora or Microsoft SQL Server to MySQL.

As part of the DMS integration, SCT introduced support for AWS service profiles, which let users store credentials for Amazon S3, AWS Lambda, and DMS in the AWS Schema Conversion Tool interface. Finally, SCT now supports Amazon Aurora with PostgreSQL compatibility as a target database for conversion and migration projects.

Creating profiles

To use the Data Migration Service inside the AWS Schema Conversion Tool, you must specify the required credentials. You can store them in a service profile, which you set up in the general application settings. To do so, click the Settings button, choose Global Settings, and go to the AWS Service Profiles tab.

AWS SCT 10 Global Settings
Here you can add service profiles to enable access to other AWS resources (DMS, RDS, Lambda). Later, you will be able to switch between these profiles directly in the application.

AWS SCT Data Migration Integration 01

To set up a service profile, specify its name, your AWS access and secret keys, and the region. Once the setup is complete, press the Save button.

AWS SCT DMS Integration 22
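
SCT stores these values for you, but if you want to sanity-check the access key, secret key, and region before saving the profile, a minimal sketch using boto3 and the STS GetCallerIdentity call looks like this (the key values and region below are placeholders, not anything taken from SCT):

```python
import boto3

# Hypothetical placeholder values -- substitute the credentials you plan
# to store in the SCT service profile.
ACCESS_KEY = "AKIAEXAMPLEKEY"      # AWS Access key
SECRET_KEY = "example-secret-key"  # AWS Secret key
REGION = "us-east-1"               # Region the profile will use

# STS GetCallerIdentity succeeds for any valid key pair, so it is a cheap
# way to confirm the credentials work before saving the profile.
sts = boto3.client(
    "sts",
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    region_name=REGION,
)
print(sts.get_caller_identity()["Account"])
```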

Using Data Migration Service

A typical database migration project consists of 12 basic steps. With this release, the AWS Schema Conversion Tool now covers, either fully or partially, the following steps of our methodology:

  • Database schema conversion;
  • Application conversion / remediation;
  • Scripts conversion;
  • Data migration.

Before you can use the Data Migration Service integrated into the Schema Conversion Tool, there are a few things you need to address. First, establish connections to the source and target databases. Then convert the schemas and apply them to the target database. This step is needed because DMS expects the tables to already exist in the target database.

When this is done, you can move on to creating a DMS task. To do so, select the objects to migrate in the source metadata tree and click Actions > Create DMS task. You can also choose this option from the pop-up menu that appears when you right-click the selected objects.

AWS SCT DMS Integration 33

In the next step, you specify the parameters needed to start the data migration process: the task name, the replication instance, and the source and target endpoints. You can also include or exclude LOB columns and enable logging for the operation.

AWS SCT DMS Integration 44
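
SCT drives all of this through its dialogs, but the same parameters map onto the underlying DMS API. As a rough illustration only (the ARNs, task name, schema, and settings below are invented placeholders, not values from this walkthrough), an equivalent task could be created with boto3 like this:

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")  # assumed region

# All ARNs, names, and schema values below are hypothetical placeholders.
response = dms.create_replication_task(
    ReplicationTaskIdentifier="sct-demo-task",              # task name
    SourceEndpointArn="arn:aws:dms:us-east-1:111111111111:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:111111111111:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111111111111:rep:INSTANCE",
    MigrationType="full-load",                              # one-time data copy
    # Table mappings select which schemas and tables to migrate.
    TableMappings="""{
      "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-demo-schema",
        "object-locator": {"schema-name": "DEMO_SCHEMA", "table-name": "%"},
        "rule-action": "include"
      }]
    }""",
    # Task settings cover the LOB handling and logging options from the SCT dialog.
    ReplicationTaskSettings="""{
      "TargetMetadata": {"SupportLobs": true, "LimitedSizeLobMode": true, "LobMaxSize": 32},
      "Logging": {"EnableLogging": true}
    }""",
)
print(response["ReplicationTask"]["ReplicationTaskArn"])
```

Inside SCT, the dialog above collects the same information for you, so there is nothing to script unless you want to automate the process.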

When everything is set and the endpoints are created, press the Create button. This takes you to the Data Migration View window.

AWS SCT DMS Integration 5

The data migration task will be placed in the queue; however, you still need to launch it manually.

AWS SCT Data Migration Integration 06
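
If you ever need to launch a queued task outside of SCT, a minimal boto3 sketch (the task ARN below is a placeholder) would look like this:

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")  # assumed region

# Hypothetical ARN of the task that SCT queued.
task_arn = "arn:aws:dms:us-east-1:111111111111:task:EXAMPLE"

# A queued task stays in the "ready" state until it is started explicitly;
# "start-replication" launches it for the first time.
dms.start_replication_task(
    ReplicationTaskArn=task_arn,
    StartReplicationTaskType="start-replication",
)
```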

You can track its progress in the Data Migration View or switch back to your SCT project.

AWS SCT DMS Integration 777
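
For monitoring outside the Data Migration View, here is a rough polling sketch with boto3; the task identifier and region are assumptions used only for illustration:

```python
import time

import boto3

dms = boto3.client("dms", region_name="us-east-1")  # assumed region

while True:
    # "sct-demo-task" is the placeholder task name used in the earlier sketch.
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-id", "Values": ["sct-demo-task"]}]
    )["ReplicationTasks"][0]
    stats = task.get("ReplicationTaskStats", {})
    print(task["Status"], stats.get("FullLoadProgressPercent", 0), "% loaded")
    if task["Status"] in ("stopped", "failed"):
        break
    time.sleep(30)
```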

When the data load is complete, we use Database Compare Suite to compare the tables in the source and target databases.

Database Compare Suite table comparison

As you can see, the data migration operation was completed successfully.
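
Database Compare Suite does the real validation here, but if all you want is a quick row-count sanity check between the source and target, a hedged sketch along these lines can serve (the DSNs and table list are made up for illustration):

```python
import pyodbc  # assumes ODBC drivers/DSNs are configured for both databases

# Hypothetical DSNs and table list -- replace with your own.
SOURCE_DSN = "DSN=source_db;UID=app_user;PWD=secret"
TARGET_DSN = "DSN=target_db;UID=app_user;PWD=secret"
TABLES = ["customers", "orders", "order_items"]

def row_count(dsn, table):
    # Table names come from our own trusted list above, not user input.
    with pyodbc.connect(dsn) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

for table in TABLES:
    src, tgt = row_count(SOURCE_DSN, table), row_count(TARGET_DSN, table)
    print(f"{table}: source={src}, target={tgt} -> {'OK' if src == tgt else 'MISMATCH'}")
```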

 

Summarizing the new version of AWS SCT, the tool is not only gaining new capabilities but also becoming more user-friendly and intuitive. The integration of the Data Migration Service for OLTP databases into the AWS Schema Conversion Tool is an expected and long-awaited update. Now you can perform and control all operations of an OLTP database migration project in one application, with no need to switch to another window. For a complete overview of the new capability from AWS, check out Working with the AWS Database Migration Service Using the AWS Schema Conversion Tool.

Be sure to check out our new AWS Schema Conversion Tool Jumpstart offer to get you up and running fast for your AWS database migration projects.


Embedding Power BI Reports in a WordPress blog

Posted on December 26th, 2016 | In Business Intelligence, Power BI
 

Back in March 2015, DB Best published a blog post on how to embed Power BI Power View interactive reports into your blogs and websites. Much has changed in Power BI since then, making it easy to embed reports into your blog articles. There are now two basic options for embedding reports. Power BI Embedded is an Azure service for publishing Power BI visuals; it uses authentication to make sure users only see the data they are supposed to. The other option is the new Publish to web feature, introduced in the October 2016 release of Power BI. The downside is that no authentication is used when viewing the report, which for blog posts and social media is usually perfectly acceptable. So you have a really cool Power BI report that you want to share with others. You could always post a static screenshot of the dashboard like the one below.

EPB-00-Example of embedded Power BI report

Let’s see how you can add the interactive Power BI experience within a blog post.
Read the rest of this entry »

Reduce the cost of an Oracle database migration project by using SQL Server Linked Servers

Posted on December 21st, 2016 | In Customer Story, Database Migration, SQL Server
 

When talking about migration from Oracle to SQL Server, we always look for ways to work with our customers to keep costs under control. We recently completed an Oracle to SQL Server migration proof of concept project for a government organization in South America. What made this project unique was the way we used SQL Server Migration Assistant (SSMA) to convert one of eleven Oracle database schemas that were running on a 12-node Oracle RAC cluster to SQL Server. The other ten schemas remained in place and we used SQL Server Linked Servers on the new SQL Server instance to connect to the Oracle schema tables. This allowed the customer to use their .NET application against the new SQL Server instance without having to do a full conversion of the ten other Oracle schemas.
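
To make the pattern concrete, here is a rough sketch (not the customer's actual code; the linked server, schema, and table names are invented) of how an application can read a table that stayed in Oracle through the SQL Server linked server using OPENQUERY:

```python
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Hypothetical connection to the new SQL Server instance.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver-host;DATABASE=MigratedDb;UID=app_user;PWD=secret"
)

# ORACLE_LINK is a placeholder linked server name pointing at the original
# Oracle RAC cluster; LEGACY_SCHEMA is one of the schemas that stayed in Oracle.
sql = """
SELECT TOP 10 *
FROM OPENQUERY(ORACLE_LINK, 'SELECT order_id, status FROM LEGACY_SCHEMA.ORDERS')
"""
for row in conn.cursor().execute(sql):
    print(row)
```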

Check out the following video to see how we approached this non-trivial task and managed to reduce the cost of the Oracle migration project.

Be sure to check out our new Jumpstart Your Database Migration Project with SSMA offer to get you up and running fast for your migration to Microsoft SQL Server.
Read the rest of this entry »

Global Retailer Tunes up OLAP Performance Ten-fold

Posted on December 15th, 2016 | In Business Intelligence, Customer Story, Data Management, SQL Server
 

When a company grows, it's no surprise that the data stored in its databases grows as well. This is true for all companies, and our customer, a global retailer in the beauty industry, was no exception. While working on this project, we managed to increase the average speed of query execution by 30x, save up to 190 GB of memory, and free up 28 GB of disk storage space. Behind these great results was a demanding optimization of the customer's system, which is built on OLAP cube technology.

We sped up query execution by 107x, freed up to 190 GB of memory, and improved the overall performance of the OLAP cube by 3x. We reached these results during a complex big data management project.

Watch the following video to see how we did it!

Read the rest of this entry »

AWS Schema Conversion Tool: Specific Features of Migrating Data Warehouses to Redshift

Posted on December 14th, 2016 | In AWS, Big Data, Business Intelligence, Database Migration
 

This post continues our video blog series on the AWS Schema Conversion Tool (SCT). In the previous blog post, we talked about the general approach to migrating data warehouses to Amazon Redshift; in this article, we dig deeper into the specific features of DW migration.

Before starting the migration of a Greenplum, Netezza, Oracle, or Teradata data warehouse to Amazon Redshift, you will want to understand the capabilities and limitations of Redshift.

This video will give you a better understanding of what you may encounter when migrating to Amazon Redshift.

Be sure to check out our new AWS Schema Conversion Tool Jumpstart offer to get you up and running fast for your migration to Amazon Redshift.
Read the rest of this entry »

AWS Schema Conversion Tool: General Approach to Migration of Data Warehouses to Amazon Redshift

Posted on December 6th, 2016 | In AWS, Big Data, Business Intelligence, Database Migration
 

This post continues our video blog series on the AWS Schema Conversion Tool (SCT). In our previous blog posts, we talked about using AWS SCT for transactional database migration projects. The AWS Schema Conversion Tool also supports migrating data warehouse (DW) workloads to Amazon Redshift.

AWS Schema Conversion Tool uses a different approach to DW migration projects compared to the transactional database migration workflow. With transactional databases, you typically have stored procedures, triggers and other database objects which deal with business logic in the database. With a data warehouse, you typically don’t have these types of objects. Instead, a data warehouse has huge volumes of historical, pre-aggregated data (with storage depth of 10-15 years). To improve query performance, you would typically have partitioned tables, materialized views and columnar tables that work with a star schema dimensional model.

Amazon Redshift is different from other data warehouse solutions in that there is no CREATE INDEX command. Instead, Redshift uses features like sort and distribution keys to optimize query performance. In the following video, we provide an overview of how to migrate your Oracle, Teradata, Greenplum, and Netezza data warehouses to Amazon Redshift using the AWS Schema Conversion Tool. The video also covers how AWS SCT provides guidance on creating the appropriate sort and distribution keys for you, based on query statistics and runtime statistics from the source database.
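
To make the sort and distribution key idea concrete, here is a hedged sketch (the cluster endpoint, table, and column names are invented, not taken from the video) of Redshift DDL issued from Python with psycopg2:

```python
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

# Hypothetical Redshift cluster endpoint and credentials.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dw", user="admin", password="secret",
)

# Instead of CREATE INDEX, Redshift relies on a distribution style/key
# (how rows spread across nodes) and a sort key (on-disk ordering).
ddl = """
CREATE TABLE sales_fact (
    sale_id     BIGINT,
    customer_id INTEGER,
    sale_date   DATE,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)  -- collocate rows that join on customer_id
SORTKEY (sale_date);   -- date-range scans skip blocks outside the range
"""
with conn, conn.cursor() as cur:
    cur.execute(ddl)
```

AWS SCT suggests candidate keys for you, as described above; the DDL here only shows the kind of table definition those suggestions translate into.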

Be sure to check out our new AWS Schema Conversion Tool Jumpstart offer to get you up and running fast for your migration to Amazon Redshift.

Read the rest of this entry »

SSMA for Oracle 7.0 — Overview of Data Migration Features

Posted on December 5th, 2016 | In Database Migration, SQL Server
 

DB Best uses a 12-step migration process to make sure all bases are covered in a typical migration project. Data migration is a critical part of any database migration project, and tools like SSMA can help in the process. In the following video, we show you how to migrate data using the SSMA tool and highlight errors that may occur during database migration.

 

Need help getting started? Check out our Jumpstart for SSMA offer!

Read the rest of this entry »

Countdown to AWS re:Invent – Come see us at booth #236

Posted on November 21st, 2016 | In AWS, Database Migration
 

Countdown to AWS reInvent 2

AWS re:Invent 2016 is just days away, and DB Best is excited to be there November 29 – December 2. We’ll be at booth #236. Come see us! To help start the conversation, here is a list of our recent AWS Schema Conversion Tool blog posts: