Working with customers over the years, questions about Data Architecture and Data Strategy have come up a lot: ‘What should my data strategy be?’ or ‘How do I go about creating a Data Strategy or a Data Architecture, and what are the critical elements of a Data Architecture?’
Let me address some of those questions and the importance of having a well-defined Data Strategy and Data Architecture. First and foremost, why do you need them? The answer is simple: to bring standardization, consistency, and a plan to your world of data. Data is the primary asset of your organization, and organizations that make effective use of data will always have a competitive advantage. You do not want your data locked in data islands and tangled webs where you cannot reach it in a timely manner and use it when you need it. The infrastructure hosting the data also has to be optimized, nimble, and efficient; a server sprawl and a server shortage are both undesirable. With that in mind, the key elements of a Data Strategy that help you arrive at a consistent Data Architecture are as follows:
We are excited to announce that the DB Best/Microsoft partnership brings a new series of workshops on SQL Server, SQL Azure, and Windows Server upgrades for companies specializing in making and/or selling software. These workshops are part of the ISV Upgrade program initiated by Microsoft Corporation to educate software companies about the new and improved capabilities of the Microsoft Data Platform and to help them upgrade their applications to support Windows Server 2012 R2, Microsoft SQL Server 2014, and Microsoft SQL Azure.
The main goal of this program is to benefit ISVs and their customers by making their applications faster, more reliable, and more secure. This is where DB Best data management and modernization experts can contribute their vast knowledge and experience, delivering 2-day workshops in different parts of the world, from North and South America to all parts of EMEA. During the sessions our experts present the Microsoft Data Platform features most useful to ISVs, such as HDInsight and In-Memory capabilities, and showcase how upgrading can optimize performance and accelerate the move to the cloud when it makes business sense for the company.
One of the best parts of this program is that it provides ISVs with this training and unique hands-on experience, delivered by DB Best subject-matter experts at no cost. Beyond the learning and reference material, it also provides access to development and testing environments, which lets companies get very specific about their particular issues and questions.
If you are interested in learning more about these training sessions and getting an ISV consultation, please reach out to us, and we’ll be happy to answer all your questions!
We are incredibly excited about our partner Microsoft’s long-awaited release of Microsoft SQL Server 2014, the foundation of Microsoft’s comprehensive data platform. SQL Server 2014 offers breakthrough performance for mission-critical applications, using enhanced in-memory technologies to deliver faster insights from any data to any user. On top of this, the SQL Server 2014 hybrid data platform lets users take advantage of the benefits of innovative cloud computing, such as cloud backup and cloud disaster recovery for on-premises SQL Server installations.
In less than 2 years, DB Best professionals have written and posted over 100 technical articles on database migration and management topics. These articles have gathered more than 20K views, thousands of likes, and dozens of comments from database management professionals all over the world. In this post we would like to present a concise and informative review of our highest-rated articles:
At first, database migration from SQL Server to SQL Azure may seem like a pretty easy thing to do. In fact, when it comes to a small project of around 20 tables and about a dozen stored procedures, this really is the case. However, if you get a project with hundreds of tables, stored procedures, functions, and triggers, if cross-database access is being used, and if objects are created dynamically, then you will face a number of complex and interesting tasks that call for approaches and solutions that are far from obvious.
In this article I will try to address some of the challenges I have faced when developing SQL Server to SQL Azure migration solutions.
The first issue is less complicated, but definitely not pleasant: SQL Azure requires each table to contain a clustered index; otherwise, you won’t be able to insert any data into it. One would think, what’s the big deal? You create a clustered index for each table and proceed with the development. However, this minor inconvenience is at the core of the second problem: SQL Azure does not support certain familiar statements.
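To make the first issue concrete, here is a minimal Python sketch of a pre-migration check. Everything in it is illustrative: the table metadata is hard-coded, the naming convention `IX_<table>_<column>` is my own assumption, and a real tool would read index information from `sys.indexes` on the source server rather than from a dictionary.

```python
# Flag tables that would block inserts on SQL Azure because they lack a
# clustered index, and generate candidate DDL for them.
# Metadata below is hypothetical sample data, not a real schema.

def missing_clustered_index(tables):
    """Return CREATE CLUSTERED INDEX statements for tables without one."""
    statements = []
    for name, meta in tables.items():
        if not meta["has_clustered_index"]:
            key = meta["key_column"]
            statements.append(
                f"CREATE CLUSTERED INDEX IX_{name}_{key} ON {name} ({key});"
            )
    return statements

tables = {
    "Orders":   {"has_clustered_index": True,  "key_column": "OrderID"},
    "AuditLog": {"has_clustered_index": False, "key_column": "LogID"},
}

for ddl in missing_clustered_index(tables):
    print(ddl)  # only AuditLog needs an index here
```

Running a check like this up front means you find the heap tables before the first failed insert, rather than during data loading.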
My last blog (Extract-Transform-Load (ETL) Technologies – Part 1) discussed the purpose, function, and some of the inherent benefits of ETL technologies for moving data from source applications into target reporting and analytic environments. Hopefully that discussion gave you a general understanding of ETL: what it is, how it can be used, and why we would want to use it. Now let’s dive a bit deeper and discover who some of the key vendors are in the ETL tools marketplace.
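The extract-transform-load cycle itself can be sketched in a few lines of Python. This is a toy illustration of the concept, not any vendor's tool; the source data, cleansing rules, and in-memory "warehouse" are all invented for the example.

```python
# Toy ETL cycle: pull raw records, cleanse/reshape them, write to a target.

def extract(source):
    """Pull raw records from the source system."""
    return list(source)

def transform(rows):
    """Cleanse and reshape: trim whitespace, drop empties, normalize case."""
    return [row.strip().title() for row in rows if row.strip()]

def load(rows, target):
    """Write transformed rows into the target store."""
    target.extend(rows)
    return target

warehouse = []
load(transform(extract(["  alice ", "", "BOB"])), warehouse)
print(warehouse)  # ['Alice', 'Bob']
```

Real ETL tools wrap this same extract → transform → load shape in scheduling, error handling, lineage tracking, and connectors, which is where the vendor differences discussed below come in.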
ETL tools have been around for some time; they have evolved, matured, and now present us with productive environments for Big Data, Data Warehouse, Business Intelligence, and analytics processing. The ETL data processing cycle itself can be simple or highly complex. Most vendor offerings address the diverse requirements well enough, although some do better than others. The challenges with ETL tools come from the other important considerations, like:
One of our core value propositions at DB Best is to help our clients lower the costs of running their enterprise-level applications and be more productive. After 10 years in the database migration business, we now have top experts, smart tools, and an established methodology that allow us to seamlessly switch applications over to Microsoft products. Let us give you an example of a very successful 3-week Siemens Teamcenter Oracle to Microsoft SQL Server migration project.
Many of our customers ask: “If my application runs and I’m happy, why do I need to migrate?” Being in the database migration business for over 10 years now, we’ve found that a value proposition based on hardware savings, licensing advantages, and professional relationships – used together or separately – makes customers interested in moving to the SQL Server platform.
As you probably know, our key migration offerings include conversion of Oracle, Sybase ASE, Sybase ASA, DB2, DB2 UDB, MySQL, Informix, Access applications and databases to SQL Server 2005/2008. Watch this video to learn more about our migration package:
So now we at DB Best have a packaged migration offering that helps us look at this whole end-to-end migration as one project. As a part of the service, we offer Portfolio Assessment, a 12-step migration methodology that allows us to estimate the cost and effort required for application migration. It also enables us to scan the entire environment to do a quantitative analysis of the database. Based on the findings, we deliver a detailed document – internally, we call it Technical Roadmap – that describes all the steps involved in the migration of that database to SQL Server.
My last blog (Column Oriented Database Technologies) discussed the differences between row- and column-oriented databases and some key players in this space. Big Data concepts and technologies were discussed in previous blogs (Big Data & NoSQL Technologies and NoSQL vs. Row vs. Column). From these blogs one should surmise that choosing the best database technology (or DBMS vendor) really depends on schema complexity, how you intend to retrieve your data, and how you get it there in the first place. We’re going to dive into this next, but before we do it is imperative that we briefly examine the differences between OLTP and OLAP database designs. We’ll leave OLTP details for a future blog, as I expect most readers already know plenty about transactional database systems. Instead we’ll focus here on OLAP and how we process Big Data using ETL technologies for data warehouse applications.
Generally, the differences between OLTP and OLAP database applications center on how frequently data must be stored and retrieved, the integrity of that data, and how much of it there is and how fast it grows. OLTP database schemas are optimized for processing transactions from an increasing number of users, while OLAP database schemas are optimized for aggregations against an increasing amount of data and exponentially many query permutations. Design considerations involve normalization, indexing, datatypes, user load, storage requirements, performance, and scalability. We will defer the many interesting details of these considerations to a future blog.
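The aggregation workload that OLAP schemas are optimized for can be shown with a tiny Python sketch. The fact table and its rows are invented sample data; real OLAP systems run this same query shape over billions of rows, which is why their schemas favor scans and aggregations over single-row transactions.

```python
# OLAP-style aggregation over a tiny denormalized fact table.
from collections import defaultdict

fact_sales = [
    # (region, product, revenue) -- hypothetical sample rows
    ("East", "Widget", 100.0),
    ("East", "Gadget",  50.0),
    ("West", "Widget",  75.0),
    ("East", "Widget",  25.0),
]

def revenue_by_region(rows):
    """Aggregate revenue per region -- the shape of a typical OLAP query."""
    totals = defaultdict(float)
    for region, _product, revenue in rows:
        totals[region] += revenue
    return dict(totals)

print(revenue_by_region(fact_sales))  # {'East': 175.0, 'West': 75.0}
```

An OLTP system, by contrast, would touch one of these rows at a time (insert an order, update a balance), which is why the two designs pull schema decisions in opposite directions.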
We have a new product in our database migration lineup – MySqlMigrator, a neat tool that helps you transfer data from MySQL tables to existing SQL Server 2008/2012 tables in a few steps. It comes with a wizard-based interface that allows restarting migration from any step. The tool uses the bulk copy mechanism for loading data from MySQL tables to SQL Server.
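The bulk-copy idea behind such a data transfer can be sketched as follows. This is illustrative Python only; MySqlMigrator's actual implementation is not public, and the batch size here is an arbitrary example value.

```python
# Minimal sketch of bulk copying: stream source rows in fixed-size batches
# instead of issuing row-by-row inserts, which cuts round trips drastically.

def batched(rows, batch_size):
    """Yield lists of up to batch_size rows for bulk insertion."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

# Example: 7 rows in batches of 3 -> batch sizes 3, 3, 1
sizes = [len(b) for b in batched(range(7), 3)]
print(sizes)  # [3, 3, 1]
```

Each yielded batch would then go to the target in a single bulk operation (SQL Server exposes this via its bulk copy APIs), which is what makes loading large MySQL tables practical.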
Here’s a quick video tutorial to get you up to speed on what MySqlMigrator can do for you: