BI4Dynamics is ready to give you full insight into your business the day after installation. Its unique data transformation with additional data layers means you do not have to compromise between service costs and BI capabilities.
Data transformation from D365F&O to Power BI & Excel
The unique data transformation through Data Lake and Synapse ensures first-class business intelligence from D365F&O. In less than a week, the BI4Dynamics automated process builds the Data Warehouse, and Analysis Services enables powerful analysis at the document level without sacrificing the speed of data processing and querying.
Data transformation is done automatically in the following steps:
- Preparation: Export F&O tables to the Data Lake and insert them into the Data Warehouse.
- Transformation: Data Warehouse Automation with best-in-class content.
- Consumption: Analyze data with visualization tools, such as Power BI and Excel.
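The three steps above can be sketched end to end in a few lines of code. This is only an illustrative miniature, not the actual BI4Dynamics implementation: SQLite stands in for the SQL Data Warehouse, an inline CSV string stands in for a table landed in Data Lake Storage, and all table and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical CSV export, standing in for an F&O table landed in Data Lake Storage.
LANDED_CSV = """SalesId,CustAccount,Amount
SO-001,C-100,250.0
SO-002,C-100,120.5
SO-003,C-200,300.0
"""

def prepare(conn, raw_csv):
    """Preparation: insert the landed CSV rows into a staging table."""
    conn.execute("CREATE TABLE stg_sales (SalesId TEXT, CustAccount TEXT, Amount REAL)")
    rows = [(r["SalesId"], r["CustAccount"], float(r["Amount"]))
            for r in csv.DictReader(io.StringIO(raw_csv))]
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", rows)

def transform(conn):
    """Transformation: build a warehouse fact table from the staging data."""
    conn.execute("""
        CREATE TABLE fact_sales AS
        SELECT CustAccount, SUM(Amount) AS TotalAmount, COUNT(*) AS OrderCount
        FROM stg_sales GROUP BY CustAccount
    """)

def consume(conn):
    """Consumption: the kind of query a Power BI or Excel report would issue."""
    return conn.execute(
        "SELECT CustAccount, TotalAmount FROM fact_sales ORDER BY CustAccount"
    ).fetchall()

conn = sqlite3.connect(":memory:")
prepare(conn, LANDED_CSV)
transform(conn)
report = consume(conn)
# report -> [('C-100', 370.5), ('C-200', 300.0)]
```

The point of the sketch is the separation of concerns: landing raw files, building warehouse tables, and querying for reports are independent stages, which is what allows each stage to be automated and scaled separately.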
Export to Data Lake extension & Data Warehouse Table Insert
The preparation phase starts with installing an extension into the F&O solution: the Export to Data Lake extension is set up, together with an Azure Storage account and a Power BI account.
The Export to Data Lake extension exports the selected tables, which land as CSV files in Data Lake Storage. These CSV files cannot be usefully consumed by Power BI directly, so the natural choice for reading them is Azure Synapse. Azure Synapse reads the CSV files and materializes (inserts) the tables into the SQL Data Warehouse.
The refresh schedule is set up by Microsoft and cannot be changed by BI4Dynamics. Moreover, the Export to Data Lake operation does not affect the production process.
That said, we are optimistic and expect the schedule of four updates per day to go live in the second half of 2022.
Data Warehouse Automation & Rich Power BI content
Data transformation is the process in which BI4Dynamics takes the raw data that landed from F&O in the staging area as CSV files, processes it with the help of 2,100+ KPIs, and transforms it into rich, insightful content from which end users can make informed business decisions.
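To make the idea of a KPI concrete, here is what one such computation might look like, reduced to its essence. The record layout, item names, and figures are all hypothetical; gross margin is simply a familiar example of the kind of measure the warehouse precomputes from raw F&O lines.

```python
# Hypothetical raw invoice lines as they might land from F&O (names illustrative).
invoice_lines = [
    {"item": "A", "revenue": 1000.0, "cost": 600.0},
    {"item": "B", "revenue": 500.0,  "cost": 450.0},
]

def gross_margin_pct(lines):
    """One illustrative KPI: gross margin as a percentage of total revenue."""
    revenue = sum(l["revenue"] for l in lines)
    cost = sum(l["cost"] for l in lines)
    return round(100.0 * (revenue - cost) / revenue, 1)

margin = gross_margin_pct(invoice_lines)
# margin -> 30.0
```

In the real warehouse, measures like this are defined once in the model and then sliced by any dimension (customer, item, period) at query time, which is what makes a library of 2,100+ KPIs reusable across reports.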
The SQL engine is highly scalable and can process terabytes of data in just a couple of hours. The engine can digest any model, and the whole process is fully automated, so you do not have to worry about it being affected by users or other environments.
A big advantage is that no IT involvement is needed to model the reports; what matters is the extent of your F&O knowledge.
Moreover, instead of showing you SQL scripts, we show you the rich content we offer. This content is built on 15 years of experience and supports all application areas.
Would you like to experience this content yourself? No problem!
Go to our Power BI Account and use the following credentials:
- Username: demo@bi4dynamics.com
- Password: PBIweL0Ve
Semantic Layer – Excel, Power BI, or another tool
The last part of our architecture is data consumption: a business/semantic layer to which users connect with Excel, Power BI, or any other tool.
Since the user experience was shown in the previous section, only the implementation options of the business layer are explained here.
Regardless of which option is selected, BI4Dynamics hosts the data in a tabular model. The hosting environment can be:
- On-Premises – Analysis Services runs on existing hardware with local access only (no web access).
- Azure Analysis Services by Microsoft – paid per service, not per user! The cost depends on the data size and the query performance provided to the users; based on these, the most suitable service tier is chosen.
- Power BI Premium Capacity by Microsoft – paid per user, not per server! Compared to AAS, you do not have to pay $5,000 per month; it can host bigger models, is very performant, provides web and mobile access, and offers the same features as a Pro account.
All of this gives you control over cost and features without forcing you to trade content for services. Moreover, you can always drill down to the document level, you do not need to maintain aggregations, and the data size can start at gigabytes or even reach terabytes. SQL will crunch any amount of data, and the semantic layer will aggregate it.
Implementation options
Virtual machine (VM)
A VM hosts the SQL Server used for the Data Warehouse. The virtual machine can be paused when data is not being processed in the BI4Dynamics Data Warehouse, which is most of the day, since the data warehouse is built within a couple of minutes to a couple of hours.
On-Premises
The BI4Dynamics Data Warehouse can be processed on an on-premises machine with SQL Server, no matter where the source database is located. This option is a good fit for companies with existing infrastructure.
On-Premises
The BI4Dynamics analytical database can be processed on an on-premises machine with SQL Server, no matter where the source database is located. This option is a good fit for companies with existing infrastructure, but it is limited by the lack of web access.
Azure Analysis Services
Azure Analysis Services is paid per service, per hour. The service cost increases with the database size and the hours needed for analysis. It is the best choice for databases under 20 GB and offers web access to the database.
Power BI Premium
Analytics runs on Power BI Premium Per User, which is charged per user, not per server. It is suitable for database models up to 100 GB and is the best choice for up to 20 users. It offers mobile and web access and all other Power BI features.
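The decision rules in this section can be condensed into a small helper. This is only a rough sketch of the guidance above, not official sizing advice: the thresholds (20 GB for Azure Analysis Services, 100 GB and 20 users for Power BI Premium Per User) are the rules of thumb from this document, and the function name is our own.

```python
def suggest_hosting(model_size_gb, user_count, needs_web):
    """Rough semantic-layer hosting suggestion based on this section's rules of thumb."""
    if not needs_web:
        # Existing infrastructure, local access only.
        return "On-Premises"
    if user_count <= 20 and model_size_gb <= 100:
        # Per-user pricing wins for small teams; handles models up to ~100 GB.
        return "Power BI Premium Per User"
    if model_size_gb < 20:
        # Per-service pricing pays off with many users on a small model.
        return "Azure Analysis Services"
    return "Evaluate per case (large model, many users)"

choice = suggest_hosting(model_size_gb=50, user_count=10, needs_web=True)
```

For example, a 50 GB model for a 10-person team with web access lands on Power BI Premium Per User, while a small model shared with hundreds of users points at Azure Analysis Services.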