sertandev: How to integrate with Dynamics 365 for Finance and Operations
Source: https://devblog.sertanyaman.com/2020...nd-operations/
In this post I will give an up-to-date overview of D365 Finance and Operations integration options, covering how each method works and its advantages and disadvantages compared to the others. I will also share information and advice from my first-hand experience using them.

OData

OData is an open protocol for serving and consuming interoperable data using common query operations over RESTful APIs. D365FO exposes all of its public data entities as OData endpoints, which can be accessed using the following URI format: https://<D365FO URL>/data/<entity name>

OData provides a quick, codeless data integration method with many query options and CRUD operations. You can use its open-standard query string language to query your data and perform data manipulations using standard OData CRUD commands, all through simple, open-standard REST calls. If you would like to call a custom method within your data entity, this is also supported by exposing custom actions on your OData entities, so the commands are extendable to a certain point.

You can certainly use OData for your own integration projects, but there is also plenty of OData-ready software available today, and it can connect directly to D365FO OData endpoints. Microsoft Power BI also supports OData connections, and you can connect Power BI this way if you do not want to set up one of the faster data integration methods, like Entity Store or Data Lake integration.

Although it looks like the optimal way to integrate data with D365FO, there are some drawbacks. OData queries and data operations execute quite slowly, and reading a large entity may take ages. OData is mainly designed for simple CRUD operations and simpler queries; if you need to execute complex queries, such as complex joins and lookups, you may start to hit its limits.
Although you can extend the available OData commands with custom actions of your own, complex operations and business logic unfortunately do not fit it very well. You may need to place such complex logic in the consumer application if you decide to integrate using OData. Remember that you can also use OData endpoints with Azure API Management, just like the custom services I will mention soon.
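To make the OData usage concrete, here is a minimal sketch in Python of how a client could compose an OData query URL for a D365FO entity. The environment URL and entity name are hypothetical, and a real call would also need an AAD OAuth bearer token in the Authorization header.

```python
def build_odata_query(base_url, entity, select=None, filter_expr=None, top=None):
    """Compose an OData query URL for a D365FO public data entity.

    base_url and entity are placeholders for illustration; authentication
    (AAD OAuth bearer token) is omitted.
    """
    params = []
    if select:
        params.append("$select=" + ",".join(select))
    if filter_expr:
        params.append("$filter=" + filter_expr)
    if top is not None:
        params.append("$top=" + str(top))
    url = f"{base_url}/data/{entity}"
    if params:
        url += "?" + "&".join(params)
    return url

# Example: read the first 10 customers, two columns only
url = build_odata_query(
    "https://myenv.operations.dynamics.com",  # hypothetical environment
    "CustomersV3",
    select=["CustomerAccount", "OrganizationName"],
    top=10,
)
```

The same URL pattern works for the standard CRUD verbs: GET to read, POST to create, PATCH to update and DELETE to remove entity records.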
More info: https://docs.microsoft.com/en-us/dyn...entities/odata

Custom Services

This is by far the most flexible and customizable way to integrate with D365FO, and the one I use most in my integration projects. Custom services are written in standard D365FO X++ code and can be used for both data-based and operation-based (posting, etc.) integrations; the limit is just your imagination. The AIF services used in Dynamics AX have been upgraded to custom services in FO, and they are automatically published to SOAP and JSON REST endpoints when deployed. However, the management and monitoring tools available for the old WCF-based AIF services are gone, leaving just bare, non-configurable service endpoints. When you deploy a custom service, the following endpoints are created, and you can call them remotely using standard AAD OAuth authorization and SOAP/REST HTTP calls:

SOAP: https://<D365FO URL>/soap/services/<service group>?wsdl
JSON: https://<D365FO URL>/api/services/<service group>/<service>/<operation>

Data exchange in custom services is done via data contracts, which are converted to XML or JSON depending on the endpoint. This performs quite fast compared to the other data integration options, since everything is handled with low-level .NET commands. In theory you can transfer large amounts of data this way, but I do not recommend it: with large transfers you may hit unforeseen limitations and timeout problems along the way. It is still possible to program some kind of paging mechanism in your service class to send larger data in small chunks and work around these limitations.
If you need an example, I used a custom service in my previous Android integration blog post, and the X++ code of this service is available on my GitHub page: https://github.com/sertanyaman/nativ...droidTest.axpp

If you need management and monitoring for your custom service, or want to configure custom authorization methods instead of AAD, you can connect it to Azure API Management and take advantage of its monitoring and authoring capabilities, as described in another of my blog posts: Empowering D365 FO service endpoints with Azure API Management service
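As a sketch of the call shape (not from the original post), the JSON endpoint pattern above can be composed and invoked like this in Python. The service group, service and operation names are hypothetical, and OAuth token acquisition is omitted.

```python
import json
import urllib.request

def custom_service_url(env_url, service_group, service, operation):
    # JSON REST endpoint pattern for a deployed X++ custom service
    return f"{env_url}/api/services/{service_group}/{service}/{operation}"

def call_custom_service(url, bearer_token, contract):
    """POST a data contract as JSON to a custom service endpoint.

    Not executed here; shown only to illustrate the request shape.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(contract).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Hypothetical names throughout:
service_url = custom_service_url(
    "https://myenv.operations.dynamics.com",
    "MyServiceGroup", "MyService", "getSalesTotals",
)
```

The data contract you pass is serialized to JSON and mapped to the X++ contract class parameters of the service operation on the other side.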
More info: https://docs.microsoft.com/en-us/dyn...ustom-services

DMF (Data management framework)

DMF allows you to import and export big data in D365FO using various source/target data files and services. This is the recommended way to integrate data with D365FO if you need to transfer large amounts of data from other systems using the supported file formats and services. It also supports incremental data load/export with change tracking, and a data staging feature for post-load validation and processing.

Every old file format supported by Dynamics AX before is still supported, and some new formats and SQL Server DB connection options are available, but there is still no support for modern data transfer file formats like JSON, Parquet, ARFF, Avro, etc. Also, as a package option you can only use ZIP compression; there is no support for GZIP, which is widely used for data transfer today. For the data inside a file, CSV is the recommended option. As you can see in the analysis below from luminousmen.com, CSV is quite an outdated file format and falls far behind modern data transfer formats in every aspect: https://luminousmen.com/post/big-data-file-formats

DMF has REST APIs and Azure actions to import and export data from outside D365FO. You can find the Azure actions for Logic Apps under the "DataManagementDefinitionGroups" section of "Execute action". There is currently no other direct file import or export method available: you need to zip your data file together with the fixed manifest and package header files (which also need to be stored in a fixed location) and set up workflows, as in the example code, to import them. It would be very nice to have one straightforward option in the future that takes the data file as input and returns consolidated error and log messages as output.
Another thing to mention is running import jobs in parallel: if you call two APIs in parallel that import into the same data entity, you get errors and problems, even though it is possible to trigger such runs using the async DMF API methods or async processing in Logic Apps. Also, the table you need to export should not have any locks on it, or you will run into some nasty problems.
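For illustration, the DMF package REST actions are exposed as OData actions bound to the DataManagementDefinitionGroups entity. A small helper like the one below (a sketch; the environment URL is hypothetical, and GetAzureWriteUrl, ImportFromPackage and GetExecutionSummaryStatus are action names from the documented data management package API) can build the URLs for a typical import sequence.

```python
def dmf_action_url(env_url, action):
    # DMF package API actions are OData actions bound to the
    # DataManagementDefinitionGroups entity
    return (f"{env_url}/data/DataManagementDefinitionGroups/"
            f"Microsoft.Dynamics.DataEntities.{action}")

# Typical import sequence: request a writable blob URL, upload the ZIP
# package there, start the import, then poll the execution status.
env = "https://myenv.operations.dynamics.com"  # hypothetical
import_sequence = [
    dmf_action_url(env, "GetAzureWriteUrl"),
    dmf_action_url(env, "ImportFromPackage"),
    dmf_action_url(env, "GetExecutionSummaryStatus"),
]
```

Each of these is called with an authorized POST; the package ZIP itself is uploaded to the blob URL returned by the first call, not to D365FO directly.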
More info: https://docs.microsoft.com/en-us/dyn...-data-packages

Recurring integrations

Recurring integrations is a D365FO data integration platform based on DMF and data entities, providing automated data exchange with third-party data providers. Once set up, the Recurring Integrations platform REST APIs can be used by third-party integrators to import and export data from and to D365FO. The recurring integrations feature can be enabled with a single button click in a DMF project and later managed and monitored from the D365FO RI admin forms. The REST APIs are then used to push and pull the data generated by the recurring integration, which can be picked up in an ordered and scheduled way. These APIs are fixed and cannot be extended with custom functionality.
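As a sketch, the push/pull REST endpoints used by recurring integrations follow the enqueue/dequeue pattern below; the environment URL, activity ID (the GUID created when you enable the recurring data job) and entity name are hypothetical.

```python
def ri_enqueue_url(env_url, activity_id, entity_name):
    # Push a file into a recurring import data job
    return f"{env_url}/api/connector/enqueue/{activity_id}?entity={entity_name}"

def ri_dequeue_url(env_url, activity_id):
    # Pull the next available package from a recurring export data job
    return f"{env_url}/api/connector/dequeue/{activity_id}"

env = "https://myenv.operations.dynamics.com"   # hypothetical
job = "8f1d3c2a-0000-0000-0000-000000000000"     # hypothetical activity ID
push_url = ri_enqueue_url(env, job, "Customers")
pull_url = ri_dequeue_url(env, job)
```

The client POSTs the data file to the enqueue URL and polls the dequeue URL for export packages; scheduling and ordering are handled by the recurring data job itself.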
More info: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/recurring-integrations

Business events

The D365FO business events feature allows you to send notifications of FO business events to Azure event handlers and trigger-based workflow providers. It comes with many out-of-the-box FO business events, plus the ability to create new ones with X++ programming. You can send notifications from these events to Azure endpoints like Service Bus, Event Hubs, Event Grid and Logic Apps (or Power Automate), and also to custom HTTPS endpoints using standard HTTP methods. It is also possible to subscribe to a business event directly from Logic Apps or Power Automate using the "When a business event occurs" trigger. If you need more information, I have a complete blog post about business events, which also describes how to create a new event using X++: How to integrate D365FO with Microsoft Flow using the new Business Events
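For illustration, a custom HTTPS endpoint receiving a business event notification could dispatch on the payload like this; the field set shown is a hypothetical minimal subset of what a real event carries, and the event and account values are invented.

```python
import json

# Hypothetical minimal business event payload
raw_event = json.dumps({
    "BusinessEventId": "SalesInvoicePostedBusinessEvent",
    "EventId": "00000000-0000-0000-0000-000000000000",
    "LegalEntity": "USMF",
    "InvoiceAccount": "US-001",
})

def handle_business_event(raw):
    """Dispatch an incoming notification based on its BusinessEventId."""
    event = json.loads(raw)
    if event["BusinessEventId"] == "SalesInvoicePostedBusinessEvent":
        return f"invoice posted in {event['LegalEntity']}"
    return "ignored"

result = handle_business_event(raw_event)
```

When you subscribe from Logic Apps or Power Automate instead, the trigger deserializes this payload for you and exposes the fields directly in the designer.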
More info: https://docs.microsoft.com/en-us/dyn...ents/home-page

Entity store

Many are surprised to hear that D365FO is not really a SQL Server based piece of software, and that its database logic and metadata are not directly accessible via SQL Server. D365FO, also known as Axapta or AX, stores its metadata, record ID mechanism, data type information and data relations in its own kernel and uses SQL Server only as bulk storage, which is therefore not directly readable by third-party applications and analytics software. Dinosaurs like me would explain the reason: the ancient Damgaard Axapta was designed as a "multi-database" application and supported Oracle DB as well as SQL Server as database options, so at the time it was decided to keep this metadata in the kernel to make it easier to switch databases if needed. So, to publish normalized, third-party-readable SQL Server data from D365FO, we need to use "data entities" and "aggregate measurements" for data normalization and export their content to another SQL Server database using the Entity store and BYOD functionalities. The Entity store option publishes aggregate measurements (star schemas of data entities; cubes, sort of) from D365FO into a separate SQL Server database (AXDW), which can later be read by Azure analytical software such as Power BI to run analytical reports.
More info: https://docs.microsoft.com/en-us/dyn...n-entity-store

Bring your own database (BYOD)

BYOD allows you to export entity data to a user-created SQL Server database instead of the Entity store database (AXDW), which end users are currently not allowed to access in production environments. Using BYOD you can export D365FO data entities to your own database manually, or with batch update or change tracking options, using the standard DMF export projects page. To start using BYOD, you need to configure your SQL Server database from the "Configure Entity export to database" tile in the "Data management" workspace.
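As a sketch, the value you enter in the "Configure Entity export to database" step is a standard SQL Server (ADO.NET-style) connection string for your own database; the server, database and credential values below are hypothetical.

```python
def byod_connection_string(server, database, user, password):
    # ADO.NET-style SQL Server connection string, the format expected when
    # configuring entity export to a user-created (BYOD) database
    return (
        f"Data Source={server};Initial Catalog={database};"
        f"Integrated Security=False;User ID={user};Password={password}"
    )

conn = byod_connection_string(
    "myserver.database.windows.net",  # hypothetical Azure SQL server
    "BYODDB", "byod_user", "********",
)
```

Because the target is a database you own, the exported entity tables can then be queried directly by any SQL client or reporting tool.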
More info: https://docs.microsoft.com/en-us/dyn...r-own-database

Azure Data Lake

Still in public preview, Azure Data Lake publishing provides a more practical, cheaper and faster storage option for the big analytical and integration databases you used to access via the Entity store or BYOD options. It can publish data from the entity store, data entities and standard FO tables to a "Data Lake Storage Gen2" storage account at "near real-time" speed using the new D365FO Data feed service. This new data feed service works faster and saves you the hassle of maintaining DMF export jobs and batches.

Publishing your data into a data lake brings many benefits. Storing data in Data Lake costs much less than SQL Server data space in Azure. Data Lake also allows you to combine your FO data with data from other applications, and with CDS data, inside the same data lake space and run consolidated reports on them. Data Lake supports the Common Data Model folder structure, which can be understood by many cloud applications and services, and it is also possible to move data from and to CDS using the CDS data lake integrations. Azure data transformation tools like Azure Data Factory can be used as well. All in all, it brings you a couple of steps closer to Azure cloud integration and provides future-proof "near real-time" data synchronization.

At the time of writing, the Azure Data Lake option is still in public preview and needs to be enabled from the D365FO "Feature management" window first. Then you will be able to choose tables to be published to the data lake from the "Export to Data Lake" form, and push the Entity store to Azure Data Lake from the Entity store window.
More info: https://docs.microsoft.com/en-us/dyn...-lake-overview

Dual Write

Dual write, one of the newest integration features of D365FO, provides near real-time, tightly coupled, synchronous data transfer between D365FO and CDS. CDS sits at the heart of the Microsoft Power Platform and is used by many MS applications (also called model-driven apps). Dual write will eventually make it possible to synchronously integrate D365FO with the other software in the Dynamics 365 package and with Power Platform components. Any data written to D365FO dual-write data entities is written directly to its CDS counterpart at near real-time speed, and the same applies in the other direction. The real-time synchronization process has "Running" and "Paused" modes, and can cache write operations while offline or paused to be executed afterwards. At the time of writing, dual write setup is done via LCS and from the Data management module. However, I will not get into the details of the setup, since it is likely to change in the short term. Once set up, you first need to do an initial sync for the entities; then you can start using them in real-time sync mode.
More info: https://docs.microsoft.com/en-nz/dyn...rite-home-page

Links & resources

https://docs.microsoft.com/en-us/dyn...ation-overview