Channel Catalog

Channel Description:

A one stop shop where the Microsoft Dynamics ecosystem can learn, share, connect and network with others within the Community. Peer to Peer discussions , product demonstrations, blogs & videos.

older | 1 | .... | 1121 | 1122 | (Page 1123) | 1124 | 1125 | .... | 1174 | newer


    Imagine…One app platform for Office 365 and Dynamics 365. Well, now you don't have to imagine anymore. Microsoft Power Platform is here!

    What is the Microsoft Power Platform?

    Power Platform combines the robust power of PowerApps, PowerBI, and Microsoft Flow into one powerful business application platform, providing quick and easy app building and data insights. Each component of the Microsoft Power Platform is built on the Common Data Service for Apps. Each component is dynamic by itself, but brilliant and masterful when combined.

    The Microsoft Power Platform brings all your data together into a common data model.

    This blog is part one of a five-part series. In this blog, we will discuss what the Microsoft Power Platform is. In parts two, three, and four, we will discuss each component of the Power Platform. In the last and culminating blog, we will discuss PowerApps and how it ties all of these components together.


    Common Data Service

    The heart of the Microsoft Power Platform is the Common Data Service for Apps (CDS). CDS is a secure database hosted in the Azure cloud, prebuilt with a standard set of entities and record types. These record types – for example, Accounts, Contacts, and various activity types – are extensible, so you can add additional data fields. Developers can also add new entities to fit their business needs. Entities have relationships to each other, and Business Rules can be created to make fields required, to hide fields, and to set default values.

    Microsoft Flow

    Microsoft Flow is a service that helps you to create automated workflows between apps and services. These workflows can be used to integrate and update data, synchronize files, get notifications and more. There are over 200 apps and services, including Common Data Service for Apps, that work with Microsoft Flow today and that number is always growing.


    PowerBI

    PowerBI is a business analytics service provided by Microsoft. Using data stored in CDS or other databases, users can build informative reports and dashboards to display important data about sales, customer service, and other business functions. These dashboards and reports can be published on websites, in SharePoint or Teams, and in Apps.


    PowerApps

    PowerApps builds on the platform of CDS and Flow to allow citizen developers to build easy-to-use applications for standard business needs. For example, say you want to build an app for representatives to inspect your company's franchises. A Power App could be built and made available on the rep's phone that surfaces your survey and inspection questions. The rep could visit the franchise and quickly fill out the data from their phone. This data is then stored in CDS. Flow would trigger notifications of tasks identified to repair defects at the franchises. And PowerBI dashboards would allow you to rank stores against each other.

    Microsoft has moved administration of the Power Platform out of Office 365 and onto its own admin site, where you can also start free trials for all of these services.

    Over the next few weeks we will dive deeper into the Power Platform. Please comment below with any questions you might have and we will try to address them.

    The post An Introduction to the Microsoft Power Platform appeared first on CRM Software Blog | Dynamics 365.



    So, you’re building a custom SSRS report using FetchXML and you have the requirement to display images from Notes in Dynamics 365.

    (If you need some initial setup assistance, see one of our previous blogs, Quotes: How to Display an Image Stored in Dynamics 365 on an SSRS Report, for help.)

    Now, fast forward to deploying and testing your report within Dynamics 365. If everything looks great except the images didn’t render properly, you’re probably thinking, “What the heck?!” Perhaps you see a little red X, maybe the image is only half-loaded or pixelated, or it could be a combination of these. Regardless, it’s frustrating! Here are a few potential quick solutions for image rendering issues that may be just what you need…

    1. Review the header of the FetchXML behind the dataset bringing those images into the report: is distinct="true" in there? If so, this is likely the source of your image rendering problem. Set it to false or, better yet, simply remove the attribute entirely if you don't need it.
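    For reference, a minimal sketch of the kind of FetchXML header to look for. The annotation (Note) entity and its attributes are typical when pulling Note images, but your query will differ; this is illustrative only:

```xml
<!-- Illustrative sketch: distinct="true" in this header is the culprit.
     Set it to false, or remove the attribute entirely. -->
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="annotation">
    <attribute name="documentbody" />
    <attribute name="filename" />
    <attribute name="filesize" />
  </entity>
</fetch>
```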

    2. When it comes to rendering images nicely, Dynamics 365 has a size limit of 75 KB. It might render an image over 75 KB, but it won't like doing it, and you probably won't like the results! As a safeguard against inadvertently attempting to display an image that exceeds the limit, consider adding a Row Visibility expression to hide an image greater than 75 KB. For example: =IIF(Fields!NOTE_filesize.Value/1024 > 75, True, False), where this would be the Set expression for: Hidden.
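    The arithmetic behind that expression is simple but worth pinning down. Here is the same check sketched outside SSRS; the function name and the assumption that the Note's filesize field holds bytes are illustrative:

```python
# Mirror of =IIF(Fields!NOTE_filesize.Value/1024 > 75, True, False).
# Assumes filesize is stored in bytes on the Note (annotation) record.

def should_hide_image(filesize_bytes: int, limit_kb: int = 75) -> bool:
    """Return True when the image should be hidden (it exceeds the limit)."""
    return filesize_bytes / 1024 > limit_kb

print(should_hide_image(80_000))  # 78.125 KB -> True
print(should_hide_image(50_000))  # ~48.8 KB -> False
```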

    3. Another solution is to avoid this size limitation altogether by reducing the size of your images within Dynamics 365. Now, there is no way to reduce the size of the original image that is added, but you can trigger a plugin that makes a copy of the image and reduces its size in the process. Most likely, your images are attached to Notes, and your plugin can have a logic flow like this:

    IF Note contains an image attachment

    IF image file size > configurable max value

    – Reduce the quality by a configurable percentage and shrink the image (maintaining scale) so the largest side matches the configurable value; repeat if still too big

    – Replace the original image with the newly reduced image
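    The loop above can be sketched as follows, in Python rather than the C# a real Dynamics plugin would use; the size-estimation heuristic and all names here are assumptions for illustration only:

```python
# Illustrative sketch of the shrink-and-retry loop from the plugin logic
# above. A real plugin would re-encode the image and measure the actual
# file size; here we only estimate it from the pixel count.

def reduce_image(width, height, size_bytes,
                 max_side=1024, max_bytes=75 * 1024, quality_factor=0.8):
    """Scale the image (keeping aspect ratio) so its largest side matches
    max_side, estimate the new file size, and repeat with a smaller target
    until the image fits under max_bytes."""
    while size_bytes > max_bytes:
        scale = min(1.0, max_side / max(width, height))  # never upscale
        # Pixel count scales with scale**2; assume file size roughly follows,
        # with an extra reduction from lowering the encoding quality.
        size_bytes = int(size_bytes * scale * scale * quality_factor)
        width = int(width * scale)
        height = int(height * scale)
        max_side = int(max_side * 0.75)  # smaller target for the next pass
    return width, height, size_bytes
```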

    This, in combination with the Row Visibility expression from #2 above, will prevent those big, unwanted images from trying to render and display in your report.

    Hopefully these solutions help clear up your image rendering issues. Be sure to subscribe to our blog for more Dynamics 365 tips and tricks!

    Happy D365’ing!


    The Affordable Care Act was passed on March 23, 2010, and employer compliance requirements began in the 2015 tax year, yet many employers have yet to comply. Are you one of those employers with your head in the sand, not filing in the hope you'll remain unscathed? Many employers have been, and continue to be, in denial that the IRS will enforce the mandate of the ACA for employers to offer compliant, affordable health insurance to their qualified employees.

    In 2016, after the election of Donald Trump as President and seeing a Republican-controlled Senate and House of Representatives, employers were lulled into complacency and used the political rhetoric of the day and health insurance instability to justify their inaction. The Executive Order signed by President Trump muddied the compliance field further, as did the many repeal-and-replace legislative activities. As a result, many employers are still not in compliance with the ACA. Each day, employers are coming to the realization that compliance is no longer optional.

    As far back as January of 2017, it was clear that the ACA penalties were coming and could reach $31 billion per year. If you are one of the employers who believed that they could avoid compliance, think again. You need to file; the penalties are coming. We have helped several employers deal with penalties for not filing their required ACA forms and for not fully complying with the ACA requirements.

    As these non-filing employers came to realize, ACA compliance can be challenging, especially when attempting to get into compliance after years of avoidance. Remember, there is no relief for intentional non-compliance with IRS regulations. That’s where Integrity Data has come to the rescue for many employers, assisting them not only with their current year compliance activities, but going back and getting their filings completed for prior years. For many of these employers, we found that they had a strategy that met the minimum compliance requirements, they just didn’t complete their annual required IRS filings. As a result, they were putting their organization at risk of multi-million dollar penalty assessments.

    Now we are finding that the mid-term elections have caused more and more employers to finally decide that it's time to begin complying with the ACA. Do you need help? If so, we have several options that make compliance easy, whether that is providing you a solution to complete your filing activities on your own, getting a little help from our team, or throwing it over the fence for our team to complete on your behalf.

    We are here and ready and we can get started today. Contact us for more information.


    I recently discovered a strange quirk with the DataArea table in AX2012. The DataArea table contains the list of all legal entities. All companies that you can see in the address bar...(read more)


    It’s not all about solutions, workflows, configuration changes, etc. Sometimes, we just need to move data from one environment to the other before making any configuration changes, and, of course, we need...(read more)


    The following procedure describes and shows steps to install full-text search feature using SQL Server setup wizard. Full-text search feature must be installed right after SQL server installation is done...(read more)


    The other day I was getting this weird little error when building a canvas app. Basically, all of my CDS Optionsets just randomly stopped working. I hadn’t actually realised that I had left the “Experimental...(read more)


    Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say, what an excellent event. I have never been to Stockholm, so I was already massively excited about this. I also got to...(read more)


    This is part 5 of a blog post series. Part 1 contains all the prerequisites, part 2 is about cloning the project and get your sandbox environment up running, part 3 is about build agents, and building...(read more)


    The productivity of an organization depends prominently on the efficiency of its employees. They form a crucial part of the business process by handling customer data and delivering services. For the employees...(read more)


    To install Microsoft Dynamics 365 Business Central On Premises, first obtain the installation DVD, which is available for download. Then, select your flavor (localization)...(read more)


    After you have installed Business Central On Premises, as explained in the previous blog post, you will notice that Personalization is not enabled. I don’t know if this is by design or not, but you can enable it as follows...(read more)


    While working with BPF fields using JavaScript, there can be instances where you might want to keep the field locked in a certain stage but unlocked in another stage. And using Xrm.Page.getControl(controlName...(read more)


    This post will show you how quickly and easily you can set up cloud storage and then copy the database around between your environments. Having said that, we are waiting on this feature in LCS, and eventually there will be tooling that does this for us in a fully managed way. However, while we are waiting, we can set this up ourselves.

    Setup the Storage Account

    You will (obviously) need an Azure subscription for this to work. All of the steps below can be completed with a PowerShell script, so advanced users will probably write that up, but here I will show how you can easily get this done with some clicking around. Still, you can set this all up in a matter of minutes manually.

    Start by opening the Azure Portal, go to "Storage Accounts", and create yourself a new one.

    You will need to choose a Resource Group, or create a new one. I typically have a Resource Group I put "DynOps" stuff in, like this Storage Account.

    I want to make this a cheap account, so I tweak the settings to save money: I opt for local redundancy (LRS) only and the Cool access tier. Perhaps the most important setting is the Region. You will want to choose a region that is the same as the VMs you are using; you get better performance and save some money (not much, though, but still).

    Oh, and also worth mentioning, the account name must be globally unique. There are a few naming guidelines for this, but simply put you will probably prefix it with some company name abbreviation. If you accidentally pick a name that is already taken, the form won't let you submit.
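    Those naming guidelines, for reference: storage account names must be 3-24 characters long, lowercase letters and digits only, and globally unique. A quick local sanity check is easy to sketch (uniqueness itself can only be confirmed by Azure):

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    """Check Azure's storage account naming rules: 3-24 characters,
    lowercase letters and digits only (uniqueness is not checked here)."""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(is_valid_storage_account_name("contosodynops01"))  # True
print(is_valid_storage_account_name("Contoso-DynOps"))   # False (uppercase, hyphen)
```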

    It only takes a few minutes for Azure to spin up the new account, so sit back, relax and take a sip of that cold coffee you forgot to enjoy while it was still warm.

    The next thing you'll do is open the newly created Storage Account, scroll down the list of things you can do with it, and locate "Blobs". Create yourself a new blob container and give it a name, for example "backups" or just "blob". Take note of the name, as you will need it later.

    Then you will want to get the Access key. The Access key needs to be kept as secret as possible, since it basically grants access to everything you put into this Storage Account. If you later worry that the key has been compromised, you can regenerate it, but then your own routines will have to be updated as well. There are other ways to secure usage of the Storage Account, but for the sake of simplicity I am using the Access key in this example.

    And now you are set. That entire thing literally takes just a few minutes, if the Azure Portal behaves and you didn't mess anything up.

    Using the Storage Account

    I've become an avid user of the d365fo.tools PowerShell library, so for the next examples I will be using it. It is super easy to install and set up, as long as the VM has an Internet connection. I'm sure you will love it too.

    Assuming it is installed, I will first run a command to save the cloud Storage Account information on the machine (using the popular PSFramework). This command will actually save the information in the Registry.

    # Fill in your own values
    $params = @{
        Name = 'Default' # Just a name, because you can add multiple configurations and switch between them
        AccountId = 'uniqueaccountname' # Name of the storage account in Azure
        Blobname = 'backups' # Name of the blob container on the Storage Account
        AccessToken = 'long_secret_token' # The Access key
    }

    # Create the storage configuration locally on the machine
    Add-D365AzureStorageConfig @params -ConfigStorageLocation System -Verbose

    Now let's assume you ran the command below to extract a bacpac of your sandbox Tier2 environment.


    $dbsettings = Get-D365DatabaseAccess

    $baseParams = @{
        DatabaseServer = $dbsettings.DbServer
        SqlUser = 'sqladmin'
        Verbose = $true
    }
    $params = $baseParams + @{
        ExportModeTier2 = $true
        DatabaseName = $dbsettings.Database
        NewDatabaseName = $($dbsettings.Database + '_adhoc')
        BacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
    }

    Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
    New-D365Bacpac @params

    You now want to upload the bacpac (database backup) file to the blob in your cloud Storage Account using the following PowerShell script.

    Set-D365ActiveAzureStorageConfig -Name 'Default' 

    $StorageParams = Get-D365ActiveAzureStorageConfig
    Invoke-D365AzureStorageUpload @StorageParams -Filepath 'D:\Backup\sandbox_adhoc.bacpac' -DeleteOnUpload

    The next thing you do is jump over to the VM (Tier1, onebox) where you want to download the bacpac. Obviously, d365fo.tools must be installed there as well. Assuming it is, and assuming you've also run the command above to save the cloud Storage Account information on that machine, you can run the following PowerShell script to download the file.

    Set-D365ActiveAzureStorageConfig -Name 'Default' 

    $StorageParams = Get-D365ActiveAzureStorageConfig
    Invoke-D365AzureStorageDownload @StorageParams -Path 'D:\Backup'

    Finally, you would run something like this to import the bacpac to the target VM.


    $bacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
    $sourceDatabaseName = "AxDB_Source_$(Get-Date -UFormat "%y%m%d%H%M")"

    #Remove any old temp source DB
    Remove-D365Database -DatabaseName $sourceDatabaseName -Verbose

    # Import the bacpac to local SQL Server
    Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacFile -NewDatabaseName $sourceDatabaseName -Verbose

    #Remove any old AxDB backup (if exists)
    Remove-D365Database -DatabaseName 'AxDB_original' -Verbose

    #Stop local environment components
    Stop-D365Environment -All

    #Switch AxDB with source DB
    Switch-D365ActiveDatabase -DatabaseName 'AxDB' -NewDatabaseName $sourceDatabaseName -Verbose

    Start-D365Environment -All

    Isn't that neat? Now you have a way to move databases between environments while we're waiting for this to be completely supported out of the box in LCS - fingers crossed!


    Suppose, we have just registered an application in Azure Active Directory and trying to acquire the token and get the below error Microsoft.IdentityModel.Clients.ActiveDirectory.AdalServiceException: ‘AADSTS65001...(read more)


    //In order to not show this to the user you will need to modify \Classes\SrsReportRunPrinter\toFile() static void Sample_AutomaticExportReportToPDF(Args _args) { FromTime startTime = timeNow(); FromTime...(read more)


    Hi everyone, today I am going to discuss the Microsoft Dynamics 365 Field Service Mobile App: where we can download it and its compatibility. Field Service Mobile Options: As we know, Field Service Mobile...(read more)

    11/17/18--10:17: StrSplit in ax 2012

    static void StrSplit(Args _args) { str test = 'SO-000493/VIN12345678987657'; List _list = new List(Types::String); container packedList; ListIterator iterator; int i; _list = Global::strSplit...(read more)


    I recently gave MB2-716 Microsoft exam from home and lot of people asked me how did I arrange my home and workstation for the exam. Preparing for exam itself is one thing and preparing your workstation...(read more)


    I would not call it sneaky, but sometimes when I find the Dynamics 365 CE UI or behaviour has changed slightly, I can attribute it to some update that was applied to the environment. There are email notifications...(read more)
