• Retrieve more than 5000 D365 records by using pagination in Power Automate

    In my last article, I pointed out that the method used to implement paging is very limited. By adding just a few new actions to the existing flow, it is possible to build a working solution for all purposes.

    NextLink contains skiptoken parameter

    Remember that the list records action of CDS (current environment) always returns an @odata.nextLink property when the defined filter criteria match more than the defined maximum page size (default = 5,000). In that case, the @odata.nextLink URL of the CDS action contains a skiptoken query parameter. This parameter can be used to implement paging in Power Automate with the CDS (current environment) action, even though the docs article advises against it. Just retrieve and decode the skiptoken parameter from the @odata.nextLink URL and use it as the skiptoken input parameter of the CDS list actions.
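    The extraction step can be sketched outside the flow designer as well. A minimal Python sketch of parsing and decoding the skiptoken from a nextLink URL; the example URL and the cookie value are illustrative, the real skiptoken returned by Dataverse is a longer paging cookie:

```python
from urllib.parse import urlparse, parse_qs

def extract_skiptoken(next_link: str) -> str:
    """Extract and decode the skiptoken query parameter from an
    @odata.nextLink URL so it can be passed back to the list action."""
    query = parse_qs(urlparse(next_link).query)
    # parse_qs already URL-decodes the value; take the first occurrence.
    return query["$skiptoken"][0]

# Hypothetical nextLink as returned by the List records action:
next_link = (
    "https://org.crm.dynamics.com/api/data/v9.1/accounts"
    "?$select=name&$skiptoken=%3Ccookie%20pagenumber%3D%222%22%20/%3E"
)
print(extract_skiptoken(next_link))  # <cookie pagenumber="2" />
```

    In a flow, the same decoding can be done with the uriComponentToString() expression on the extracted parameter.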

    #Dynamics365CE #CDS #PowerAutomate #Pagination #Paging #Skiptoken

    Continue reading...

  • Bulk deletion job for D365 using scheduled flow

    While planning the implementation of bulk delete jobs for all integrated D365 organizations, I was saddened to find that the old limitations remain unchanged.

    Bulk deletion jobs have annoying limitations

    The feature has several limitations:

    • It does not support the deletion of activities, unless you create the job programmatically using the Web API.
    • The filter criteria cannot be changed after creation.
    • A job cannot be added to a solution as a component.
    • Multithreading is not supported.
    • The email notification cannot be limited to failure cases.

    In conclusion, I decided to check the possibilities in Power Automate.

    Recurring flow is powerful and supports CI/CD


    A flow has none of the limitations listed above, so I decided to delete data in bulk using a recurring flow. The challenge was to find a way to delete more than 100k records. The solution is to implement pagination using @odata.nextLink and the Do until control. The list records action of CDS (current environment) always returns an @odata.nextLink property when the defined filter criteria match more than the defined maximum page size (default = 5,000). The flow retrieves and deletes records until the property is no longer returned. Please note that this pagination design only works if the retrieved data is deleted before the next page is retrieved. I am looking for a working alternative for other use cases and will publish an article once I have one using the CDS (current environment) actions.
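    The Do until pattern described above can be sketched as follows. This is a minimal stand-in, not the flow itself: list_records and delete_record mimic the CDS actions, and the loop exits once no nextLink marker is returned. Note that deleting each page before listing again is exactly what makes the first page "refill", so no skiptoken is needed:

```python
PAGE_SIZE = 3
records = list(range(10))  # stand-in data set matching the filter

def list_records():
    """Mimics 'List records': returns one page plus a nextLink marker
    whenever more data than the page size matches the filter."""
    page = records[:PAGE_SIZE]
    next_link = "@odata.nextLink" if len(records) > PAGE_SIZE else None
    return page, next_link

def delete_record(r):
    """Mimics the 'Delete a record' action."""
    records.remove(r)

deleted = 0
while True:                  # "Do until" control
    page, next_link = list_records()
    for r in page:           # "Apply to each"
        delete_record(r)
        deleted += 1
    if next_link is None:    # exit condition: no @odata.nextLink returned
        break

print(deleted)  # 10
```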

    Concurrency control is the key for dynamic performance


    The list records action of CDS returns a list of items. The best way to process each item is the apply to each action. By default, those items are processed one by one, but you can change this behavior in the action settings. The maximum degree of parallelism is 50, which worked in many cases without reaching any API limits. This might change once the API request capacity add-on is fully supported.

    This is what the flow looks like

    #Dynamics365CE #BulkDeletionJob #PowerAutomate #Pagination #ConcurrencyControl #Parallelism #Multithreading

    Continue reading...

  • Database capacity use is gradually growing caused by marketing activities

    • The Marketing Activity entity is a technical entity that stores information about metadata published to D365 by the marketing services.
    • Data older than 1 week can be deleted.

    To free up storage, I went through all D365 organizations looking for candidate data that could be deleted. In doing so, I discovered a new entity called Marketing Activity (msdyncrm_mktactivity). It stores details from the metadata sync process of the D365 Marketing service user. This user performs internal tasks like segment updates (adding new members to a segment based on its filter criteria). The result of each internal task is stored as a marketing activity. Depending on the number of active marketing elements like segments, customer journeys and marketing forms, the table can grow noticeably within a very short time.

    Microsoft Support has the answer

    Before creating a new support request, I performed a quick internet search and found a forum entry on this topic. The solution from Microsoft Support is easy to implement:

    To reduce the msdyncrm_mktactivity table, you can run a bulk delete job (or schedule it to run periodically) to delete data that's older than a week. We currently don't have any system jobs that will automatically clear this table, however we will have it in the future; but we don’t have the exact date though.

    Therefore, I decided to create a flow in Power Automate that deletes successfully completed marketing activities older than 1 week. Activities in any other status are deleted if they are older than 4 weeks. This gives us a better overview of failed internal tasks - currently there aren't any :).
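    The two-tier retention rule translates into a single OData filter for the list records action. A sketch of building that filter; the field names (createdon, statuscode) and the status value for "completed" are illustrative assumptions, not the exact schema of msdyncrm_mktactivity:

```python
from datetime import datetime, timedelta, timezone

def build_filter(now=None):
    """Builds an OData filter matching completed activities older than
    1 week and activities in any other status older than 4 weeks.
    Field names and status values are assumptions for illustration."""
    now = now or datetime.now(timezone.utc)
    one_week = (now - timedelta(weeks=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    four_weeks = (now - timedelta(weeks=4)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (
        f"(statuscode eq 1 and createdon lt {one_week}) or "
        f"(statuscode ne 1 and createdon lt {four_weeks})"
    )

print(build_filter(datetime(2020, 6, 1, tzinfo=timezone.utc)))
```

    In the flow itself, the cut-off dates would come from the addDays() or addToTime() expressions instead of a fixed timestamp.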

    Flow has a few advantages over bulk delete jobs

    In my next article, I will describe the flow I created to delete any data from D365. I prefer the Power Automate service over bulk delete jobs because

    • it has better error handling,
    • it can access more entities, such as activities,
    • and it allows you to include components like flows in a solution (CI/CD).

    #Dynamics365Marketing #MarketingActivity #Storage #PowerAutomate

    Continue reading...