The mobile application industry keeps delivering good news: augmented reality (AR), voice control, virtual reality (VR), and other impressive technologies are becoming more and more popular. More importantly, modern smartphones have enough power to process AR and VR graphics. You can simply insert your smartphone into an affordable VR headset to dive into virtual reality. As a result, most smartphone users can now access these breathtaking technologies, which are no longer expensive and do not place excessive demands on a mobile device.
To see the DB Best powered AR & VR application in action, watch the following video.
We’re confident you’re already wondering how you can put these incredible technologies to work. DB Best can partner with your company to create a similar application for your business in a short time frame. Contact us now to learn how we can bring your ideas to life in an AR & VR application for your business.
When visiting an exhibition, you may want to use a mobile application developed specifically for the event, especially at an air show, which tends to cover a large area. During the show, you want to know exactly where you are, what’s around you, and where to go next. Implementing these features with map services is a fairly simple and common task. But how do we deliver information about the various exhibits? This is where we had an opportunity to integrate a truly fascinating IoT device: we used Apple’s iBeacon technology in our Live Airshow mobile application to enhance the attendee’s experience.
Please watch the following video to learn more about this technology and about other features of the Live Airshow application.
AWS Database Migration Service helps you migrate databases to AWS easily and securely. The AWS Database Migration Service can migrate your data to and from most widely used commercial and open-source databases. The service supports homogeneous migrations such as Oracle to Oracle, as well as heterogeneous migrations between different database platforms, such as Oracle to Amazon Aurora or Microsoft SQL Server to MySQL.
As part of the DMS integration, SCT introduced support for AWS service profiles, which let users store credentials for Amazon S3, AWS Lambda, and DMS directly in the AWS Schema Conversion Tool interface. Finally, SCT now supports Amazon Aurora with PostgreSQL compatibility as a target database for conversion and migration projects.
To use the AWS Database Migration Service from inside the AWS Schema Conversion Tool, you must specify the required credentials. You can store them in the corresponding service profile, which you set up in the general application settings. To do so, click the Settings button, choose Global Settings, and go to the AWS Service Profiles tab.
Here you can add service profiles to enable access to other AWS resources (DMS, RDS, Lambda). Later, you will be able to switch between these profiles directly in the application.
To set up a service profile, specify its name, your AWS access key and secret key, and the region. When the setup is complete, press the Save button.
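As an aside, the same three values an SCT service profile stores correspond to what the AWS CLI and SDKs read from the shared credentials and config files. A minimal sketch of that equivalent setup, using AWS’s documented placeholder keys and a made-up profile name, might look like this:

```ini
; ~/.aws/credentials -- placeholder values from AWS documentation, not real keys
[sct-migration]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFKEoZ/K7MDENG/bPxRfiCYEXAMPLEKEY

; ~/.aws/config -- region for the same hypothetical profile
[profile sct-migration]
region = us-east-1
```

Keeping credentials in a named profile like this lets you switch between accounts the same way SCT lets you switch between its service profiles.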
Before you can actually use the Data Migration Service integrated into the Schema Conversion Tool, there are a few things you need to address. First, you have to establish connections to the source and target databases. Then you must convert the schemas and apply them to the target database. This step is required because DMS expects the tables to already exist in the target database.
When this is done, you can move on to creating a DMS task. To do so, select the objects whose data you want to migrate in the source metadata tree and click Actions > Create DMS task. Alternatively, you can choose this option from the context menu after right-clicking the selected objects.
In the next step, you specify the parameters needed to start the data migration process: the task name, the replication instance, and the source and target endpoints. You can also include or exclude LOB columns and enable logging for the operation.
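For readers curious about what the dialog maps to underneath, the parameters above line up with the arguments of the DMS CreateReplicationTask API call. The sketch below only assembles that argument set (it makes no AWS calls); the function name, ARNs, and task name are all illustrative placeholders, and the LOB settings shown assume DMS’s limited LOB mode:

```python
import json

def build_dms_task_args(task_name, instance_arn, source_arn, target_arn,
                        include_lobs=True, enable_logging=False):
    """Assemble the parameters for a full-load DMS replication task."""
    task_settings = {
        # Limited LOB mode truncates LOBs at LobMaxSize (in KB)
        "TargetMetadata": {
            "SupportLobs": include_lobs,
            "LimitedSizeLobMode": include_lobs,
            "LobMaxSize": 32,
        },
        "Logging": {"EnableLogging": enable_logging},
    }
    return {
        "ReplicationTaskIdentifier": task_name,
        "ReplicationInstanceArn": instance_arn,
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "MigrationType": "full-load",  # one-time data migration, no CDC
        "ReplicationTaskSettings": json.dumps(task_settings),
    }

args = build_dms_task_args(
    "oracle-to-aurora-load",
    "arn:aws:dms:us-east-1:123456789012:rep:EXAMPLE",
    "arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
    "arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
    enable_logging=True,
)
print(args["MigrationType"])
```

You would pass a dictionary like this to the API via an SDK or the AWS CLI; SCT builds the equivalent request for you from the dialog fields.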
When everything is set and the endpoints are created, press the Create button. This takes you to the Data Migration View window.
The data migration task will be placed in the queue; however, you still need to start it manually.
You can track its progress in the Data Migration View or switch back to your SCT project.
When the data load completes, we use Database Compare Suite to compare the tables in the source and target databases.
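The underlying idea of such a check is simple: confirm that matching tables hold the same rows on both sides. Database Compare Suite automates far more than this, but as a minimal illustration of the concept, the sketch below compares row counts and a row checksum between two databases, using stdlib sqlite3 as a stand-in for the real source and target engines (table and data are made up):

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Return (row_count, checksum) for a table, ordering rows for stability."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

# Two in-memory databases standing in for source and target
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Ada"), (2, "Grace")])

if table_fingerprint(source, "customers") == table_fingerprint(target, "customers"):
    print("tables match")  # prints "tables match"
```

A real comparison tool adds schema-aware type handling, sampling for very large tables, and per-row diffs, but the count-plus-checksum pattern is the core of a fast post-migration sanity check.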
As you can see, the data migration operation was completed successfully.
Summarizing the features implemented in the new version of the AWS SCT, we can see that the tool is not only gaining new features but also becoming more user-friendly and intuitive. The integration of the Data Migration Service for OLTP databases into the AWS Schema Conversion Tool is an expected and long-awaited update. Now you can perform and control every operation of an OLTP database migration project in a single application, with no need to switch to another window. For a complete overview of the new capability from AWS, check out Working with the AWS Database Migration Service Using the AWS Schema Conversion Tool.
Back in March of 2015, DB Best published a blog post on how to embed Power BI Power View interactive reports into your blogs and websites. Much has changed in Power BI since then, making it easy to embed reports into your blog articles. There are now two basic options for embedding reports. Power BI Embedded is an Azure service for publishing Power BI visuals; it uses authentication to make sure users can only see the data they are supposed to. The other method is the new Publish to web feature introduced in the October 2016 release of Power BI. Its downside is that no authentication is used when viewing the report; for the most part, though, this is perfect for blog posts and social media. So suppose you have a really cool Power BI report that you want to share with others. You could always take a static screenshot of the dashboard like the one below.
When talking about migrating from Oracle to SQL Server, we always look for ways to work with our customers to keep costs under control. We recently completed an Oracle to SQL Server migration proof-of-concept project for a government organization in South America. What made this project unique was the way we used SQL Server Migration Assistant (SSMA) to convert one of eleven Oracle database schemas running on a 12-node Oracle RAC cluster to SQL Server. The other ten schemas remained in place, and we used SQL Server linked servers on the new SQL Server instance to connect to the Oracle schema tables. This allowed the customer to run their .NET application against the new SQL Server instance without a full conversion of the other ten Oracle schemas.
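To make the hybrid setup concrete: once a linked server points at the Oracle instance, SQL Server can reach the unconverted Oracle tables through four-part names, so a single query can join migrated and unmigrated data. The sketch below only composes such a query as a string; the linked server name, schemas, and columns are all hypothetical, not taken from the actual project:

```python
LINKED_SERVER = "ORACLE_RAC"  # hypothetical linked server pointing at Oracle

def four_part_name(schema, table, catalog=""):
    """Four-part name SQL Server uses to reach a linked-server table.
    Oracle linked servers typically leave the catalog part empty."""
    return f"{LINKED_SERVER}.{catalog}.{schema}.{table}"

# A query joining a table still in Oracle with one migrated to SQL Server
query = f"""
SELECT o.ORDER_ID, c.CustomerName
FROM {four_part_name('SALES', 'ORDERS')} AS o  -- still in Oracle
JOIN dbo.Customers AS c                        -- migrated to SQL Server
  ON c.CustomerId = o.CUSTOMER_ID;
"""
print(four_part_name("SALES", "ORDERS"))  # ORACLE_RAC..SALES.ORDERS
```

The application keeps talking to SQL Server only; the linked server quietly forwards the Oracle side of the query, which is what let this customer defer converting the remaining ten schemas.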
Check out the following video to see how we approached this non-trivial task and managed to reduce the cost of the Oracle migration project.
When a company grows, it’s no surprise that the data stored in its databases grows as well. This is true for all companies, and our customer, a global retailer in the beauty industry, was no exception. While working on this project, we increased the average speed of query execution by 30x, saved up to 190 GB of memory, and freed up 28 GB of disk storage space. Behind these great results was an intensive optimization of the customer’s system based on OLAP cube technology.
We sped up query execution by 107x, freed up to 190 GB of memory, and improved the overall performance of the OLAP cube by 3x. We reached these fabulous results during a complex big data management project.
This post continues our video blog series on the AWS Schema Conversion Tool (SCT). In our previous blog posts, we talked about using AWS SCT for transactional database migration projects. The AWS Schema Conversion Tool also supports migrating data warehouse (DW) workloads to Amazon Redshift.
The AWS Schema Conversion Tool takes a different approach to DW migration projects compared to the transactional database migration workflow. With transactional databases, you typically have stored procedures, triggers, and other database objects that implement business logic in the database. With a data warehouse, you typically don’t have these types of objects. Instead, a data warehouse holds huge volumes of historical, pre-aggregated data (often spanning 10-15 years). To improve query performance, you would typically have partitioned tables, materialized views, and columnar tables organized around a star schema dimensional model.
Amazon Redshift differs from other data warehouse solutions in that there is no CREATE INDEX command. Instead, Redshift uses sort and distribution keys to optimize query performance. In the following video, we provide an overview of how to migrate your Oracle, Teradata, Greenplum, and Netezza data warehouses to Amazon Redshift using the AWS Schema Conversion Tool. The video also covers how AWS SCT provides guidance on creating appropriate sort and distribution keys for you, based on query statistics and runtime statistics from the source database.
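To show what replaces indexes in practice, the sketch below builds Redshift-style table DDL where a distribution key co-locates rows that are frequently joined and a sort key speeds up range-restricted scans. The table and column names are made up for illustration; only the DISTKEY/SORTKEY clauses reflect actual Redshift syntax:

```python
def redshift_ddl(table, columns, dist_key, sort_keys):
    """Render CREATE TABLE DDL with Redshift distribution and sort keys."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"DISTKEY ({dist_key})\n"            # rows distributed by this column
        f"SORTKEY ({', '.join(sort_keys)});"  # rows stored sorted by these
    )

ddl = redshift_ddl(
    "fact_sales",
    [("sale_id", "BIGINT"), ("store_id", "INT"),
     ("sold_at", "TIMESTAMP"), ("amount", "DECIMAL(10,2)")],
    dist_key="store_id",   # co-locate rows joined on store_id
    sort_keys=["sold_at"], # prune blocks for date-range queries
)
print(ddl)
```

Choosing these keys well is exactly the part that is hard to do by hand, which is why SCT’s statistics-driven suggestions are valuable.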
DB Best uses a 12-step migration process to make sure all bases are covered in a typical migration project. Data migration is a critical part of any database migration project, and tools like SSMA can help in the process. In the following video, we show you how to migrate data using the SSMA tool, as well as highlight errors that may occur during database migration.
AWS re:Invent 2016 is just days away, and DB Best is excited to be there November 29 – December 2. We’ll be at booth #236. Come see us! To help start the conversation, here is a list of our recent AWS Schema Conversion Tool blog posts: