
Channel Description:

A one-stop shop where the Microsoft Dynamics ecosystem can learn, share, connect, and network with others within the Community: peer-to-peer discussions, product demonstrations, blogs & videos.

older | 1 | .... | 1158 | 1159 | (Page 1160) | 1161 | 1162 | .... | 1174 | newer


    Discover how to gain a competitive edge by staying on top of market trends as they unfold. In this webinar, Senior Product Marketing Manager Anantha Ramachandran...(read more)


    The Customer Engagement Application, formerly Microsoft Portals (and before that, ADXStudio Portals), is a Software-as-a-Service (SaaS) web application run by Microsoft in the cloud. It uses Microsoft CRM as the persistence layer for business data and application metadata; the latter defines the web application’s look, feel, behavior, and restrictions. Deploying such Portals in version-controlled scenarios can present substantial challenges that prevent efficient testing of the target environments and make it problematic to roll back deployed changes. In today’s blog, we will review the issues, the available tools and options, and detail a methodology for achieving positive results.

    Let’s start by discussing the challenges. Typical Portal deployment consists of three steps:

    1. Deployment of Portal solutions into the target CRM
    2. Deployment of business customizations related to the Portal (entities, forms, views, option sets)
    3. Transfer of the data stored in Portal-related entities

    In the typical scenario, Microsoft performs step #1 by installing a new Portal into the target CRM instance. Once this is done, Microsoft also populates the Portal configuration entities with sample data and makes the Portal available for modification. This approach assumes a single production environment and does not anticipate any development happening anywhere other than the target CRM. Such development may involve business customizations to CRM objects, as well as modification and expansion of the Portal configuration data.

    In the development workflow, where business customizations are done in a DEV environment and then ported to the target PROD or QA CRM, step #2 above is managed consistently and presents no challenge. However, when both the source and target CRM environments are bound to their own SaaS Portal applications, any development related to the source Portal represents data that must be transferred to the target CRM, which presents the following challenges:

    • Both the source and target CRMs may have Portals with the same name, so any data transfer may create collisions and confusion in the names of the data elements
    • Portal-dependent child records in both CRMs may have the same unique identifiers (GUIDs) when one environment was procured as a clone of the other; any data transfer from source to target will honor update and insert operations, but not deletions
    • If Portal data already exists in the target, there is no way to roll back the deployment should any issues arise
    • There is no way to create a new instance of a Portal record in the target CRM when a record with a matching GUID already exists (true for any dependent child records as well)

    Now, let’s discuss our options. Presently, there are only two tools available to transfer Portal data:

    • XRMToolbox plugin for Portal transfer – this uses a proprietary data-encoding format and an inconsistent data schema with several compatibility issues; it only allows updating existing Portal records in the target environment. When the target Portal is not selected and the GUIDs of the source and target data sets match, it produces unusable results. The tool also has issues transferring M:M-related data, which puts it out of scope for Portal deployment.
    • CRM Configuration Migration Tool – provided with the CRM SDK – is a universal tool that can properly handle linked data transfers and M:M relationship migration in several iterations. This tool exports data according to a selected schema (which can easily be built for the current version of the Portal) and imports it as is. When GUIDs in the source and target match, an update is executed. There are extensive options for updates, but there is no way to force the tool to create new records or delete obsolete ones. This is the only viable tool for our purposes, so let’s see how it can be used to attain our objective.

    Next, let’s examine our solution. The happy path for Portal data deployment must follow this sequence of events:

    1. In the target environment, the Portal record is renamed for archival purposes and to avoid name collision with the new Portal.
    2. The source data is extracted and stored into a file.
    3. In the source data file, all GUIDs identifying Portal records are replaced with the new ones to avoid collision with any of the target entities.
    4. The modified source data is imported to the target CRM – the new Portal record and dependent data set is created.
    5. In the target environment, the Portal SaaS application is switched from the archived Portal record to the newly imported one.
    6. If testing of the new Portal fails, the SaaS application can be switched back to the archived Portal record.

    Step #3 is a simple regular-expression replacement of each GUID found. For example, a GUID can be matched with:

    [a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}

    However, we must not forget that many unique record identifiers may be referenced multiple times within the data source file (linked entities), and each repeated GUID must be replaced with the same new GUID as its first occurrence. A further complication is that we want to replace only GUIDs defined as record IDs in the data file, not GUIDs that are merely referenced: many records refer to entities outside the scope of the Portal schema.

    The solution is a simple command-line tool developed internally as a universal regex replacer: its default patterns search for record IDs, build a dictionary mapping them to new GUIDs, and then replace all GUIDs in the file according to that dictionary. GUIDs not in the dictionary are left untouched.
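    The two-pass dictionary approach described above can be sketched in a few lines of Python. This is only an illustration, not the internal tool itself: the assumption that record IDs appear as id="…" attributes in the exported data file is ours, and the pattern should be adjusted to the actual schema produced by the export tool.

```python
import re
import uuid

# GUID pattern (same shape as the one shown above, with a non-capturing group)
GUID = r'[a-fA-F0-9]{8}-(?:[a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}'

# Assumed pattern for record IDs in the exported data file -- adjust to your schema
RECORD_ID = re.compile(r'id="(' + GUID + r')"')

def remap_guids(text: str) -> str:
    """Replace every record-ID GUID with a fresh one, consistently.

    Only GUIDs that appear as record IDs are remapped; GUIDs that are
    merely referenced (e.g. lookups to records outside the Portal schema)
    are left untouched, so links to existing data stay intact.
    """
    # First pass: build a dictionary mapping each record ID to a new GUID.
    mapping = {m.group(1).lower(): str(uuid.uuid4())
               for m in RECORD_ID.finditer(text)}

    # Second pass: replace all occurrences of mapped GUIDs, wherever they
    # appear (record IDs and repeated references alike).
    def sub(m: re.Match) -> str:
        return mapping.get(m.group(0).lower(), m.group(0))

    return re.sub(GUID, sub, text)
```

    Because the mapping is built before any replacement happens, a GUID that appears first as a reference and only later as a record ID is still remapped consistently across the whole file.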

    This tool produces a data file that, when imported, creates brand-new target Portal entities and brand-new dependent entities, while any imported entities linked to existing data outside the Portal schema remain properly linked.

    Since the new record set is created in the target environment, there is no issue with deleting obsolete configuration records: those stay linked to the archived versions of configurations and will be deleted automatically when the old Portal records are purged.

    The only minor complication in the suggested deployment process relates to contacts assigned web roles. We do not port or update contacts during deployment, and we cannot assume that the contacts in the target environment will match those from the source, so web role assignments would be lost if we attempted to transfer them. For a clean data transition, it is recommended to drop all web role assignments in the source environment prior to exporting the Portal data. These assignments may be backed up beforehand and restored in the source afterward to maintain the ability to unit-test in DEV.

    In summary, with a simple internally developed command-line tool, we can use the standard CRM Configuration Migration application to transfer the Portal data from the source environment to the target in such a way that a new Portal record and its dependent child records are created, making it available for the SaaS application to switch to. At the same time, the previous versions of the Portal records in the target environment are not destroyed or overwritten, making it possible to roll back the SaaS assignment to any previously released state.

    A special note: with a slight modification of the default search-and-replace regular expressions, it is possible to use our tool to condition the data file extracted by the XRMToolbox plugin for Portals. While that data transfer experiment was successful, the plugin is still not recommended because of the internal issues found within it, as well as its inability to migrate M:M relationships properly.

    Don’t forget to subscribe to our blog for more!

    Happy D365’ing!



     Hi Readers,

    I hope you are all following this series and have completed what we have covered so far.

    If you are new to it, please refer to the Table of Index.

    Last but not least, in this article we will discuss how to develop a report in AL.

    A customer would like to analyze Source of Sales based on Item Sold.

    Read Complete Article »


    Learning Path is an intuitive feature compared to Customized Help. Customized Help takes you to an entirely different section, whereas Learning Path guides you through the application as you use...(read more)


    I am about to take the MB2-719 certification, this certification covers the Dynamics 365 for Marketing Application. I plan to create a series of blog posts that collectively should help anyone else preparing...(read more)


    Tested on: Dynamics 365 version 9.2, PSA solution version 3.3, Unified Interface. Actuals are records that hold financial data about approved Time and Expense Entries, Milestones, and Materials in Dynamics 365 Project Service Automation. Actuals are primarily created in the context of Sales (revenue) and Cost for Projects and Project Contracts, and are in most cases passed on to a financial system / ERP for invoicing and the general ledger. By design, Actuals cannot be deleted, as deleting...(read more)


    Throughout 2018, the Microsoft Dynamics ERP family has changed (and you know that for sure), and Microsoft has pushed hard to help customers and partners “change their mind”...(read more)

  • 12/29/18--09:05: Finishing the year?
  • What a year! I don't think I can find a phrase that sums up what this year was, and that is why I am sitting down to write this article. This year has been the year with the most challenges and achievements at the professional level...(read more)


    DMM365: Data Migration Management tool for Dynamics 365. Given: deploy a Portal from the target to the source environment. Conditions: test the new version with the actual data, with the option of a quick restore of the old version...(read more)


    I am about to take the MB2-719 certification, this certification covers the Dynamics 365 for Marketing Application. I plan to create a series of blog posts that collectively should help anyone preparing...(read more)


    In today's post we will discuss using Streams to export or import files, in other words, file handling using Streams. The first example I am discussing is how to import an image to an Item Picture...(read more)


    One of the most common asks as an administrator is to know when a user started accessing the system and from where. In your Dynamics 365 Customer Engagement apps, you can enable Auditing for User...(read more)


    Microsoft has created an excellent description of this in the Assortment management doc-page. Retail assortments are essential for defining which products should be available across retail channels. Assigning a product to an assortment will assign the product to the stores that have the assortment. This makes it possible to sell the product in those stores.

    But there is something missing, and that is the use of assortments for replenishment and procurement. Retailers want master planning to suggest procurement and transfers only for products that belong to the stores' assortments. You can control requirement parameters per warehouse, but there is no standard way to use the assortment to control the generation of planned orders.

    This blog post shows how to make a very small extension that ensures only products belonging to a store's assortment will generate planned orders. The approach is to look into how the product lifecycle state functionality works and extend it with an assortment planning parameter. I have called this parameter “Is lifecycle state active for assortment procurement”.

    What it does is validate whether a product is in the assortment of the store: if it is, the product will be requirement calculated and will generate planned orders. If the product is not in the assortment of the store, no planned orders will be generated.

    To make this happen, I needed to create 4 extensions. The first three add a new field on the product lifecycle form. For an experienced developer this is easy to create, so there is no need to spend time on it in this blog post.

    The real fun is how to manage and control the planned order generation. This happens in an extension to the ReqSetupDim class. Here there is a lookup to the assortment on the product, and a check whether the store has this assortment. If not, master planning will not generate any planned orders. I therefore create an extension class and use the new method wrapping/Chain of Command (CoC) feature to add some additional code.

    /// <summary>
    /// Contains extension methods for the ReqSetupDim class.
    /// </summary>
    [ExtensionOf(classStr(ReqSetupDim))]
    final class ReqSetupDim_extension
    {
        /// <summary>
        /// Validates whether a product should be assortment planned.
        /// </summary>
        /// <param name="_inventDimComplete">The complete inventory dimension passed to the wrapped method.</param>
        /// <returns>false if the product is not assortment planned; otherwise, the default return value.</returns>
        public boolean mustReqBeCreated(InventDim _inventDimComplete)
        {
            boolean ret = next mustReqBeCreated(_inventDimComplete);

            if (ret && _inventDimComplete.InventLocationId)
            {
                InventTable                 inventtable;
                EcoResProductLifecycleState ecoResProductLifecycleState;

                // Fetch the lifecycle state and product reference for the item
                select firstonly ProductLifecycleStateId, Product from inventtable
                    where inventtable.ItemId == this.itemId();

                // Validate that the product is active for planning and that
                // assortment planning is enabled on its lifecycle state
                select firstonly RecId from ecoResProductLifecycleState
                    where   ecoResProductLifecycleState.IsActiveForAssortmentPlanning == true
                        &&  ecoResProductLifecycleState.IsActiveForPlanning == true
                        &&  ecoResProductLifecycleState.StateId == inventtable.ProductLifecycleStateId;

                if (ecoResProductLifecycleState)
                {
                    RetailStoreTable                    store;
                    EcoResProduct                       product;
                    RetailAssortmentLookup              assortmentLookupInclude;
                    RetailAssortmentLookup              assortmentLookupExclude;
                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;
                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupExclude;

                    // Find the OMOperatingUnitID of the store tied to the InventLocationId
                    while select firstonly OMOperatingUnitID from store
                        where store.inventLocation == _inventDimComplete.InventLocationId
                    {
                        // Check whether the product is in the assortment of the store in question
                        select RecId from product
                            where product.RecId == inventtable.Product
                        exists join assortmentLookupInclude
                            where   assortmentLookupInclude.ProductId == product.RecId
                                &&  assortmentLookupInclude.lineType == RetailAssortmentExcludeIncludeType::Include
                        exists join assortmentLookupChannelGroupInclude
                            where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                                &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId
                        notexists join assortmentLookupExclude
                            where   assortmentLookupExclude.ProductId == product.RecId
                                &&  assortmentLookupExclude.lineType == RetailAssortmentExcludeIncludeType::Exclude
                        exists join assortmentLookupChannelGroupExclude
                            where   assortmentLookupChannelGroupExclude.OMOperatingUnitId == store.OMOperatingUnitID
                                &&  assortmentLookupChannelGroupExclude.AssortmentId == assortmentLookupExclude.AssortmentId;

                        if (!product)
                        {
                            // The product does NOT belong to the store's assortment; do not plan it
                            ret = false;
                        }
                    }
                }
            }
            return ret;
        }
    }
    

    I also have code to restrict the creation of manual purchase orders, where similar code can be used, but let's hope Microsoft will further extend standard Dynamics 365 with assortment-based procurement planning.

    Copy with pride, and let's hope next year will give us 365 more opportunities.



    Create an extension from the MarkupTrans form, and copy the event handler for the form's Initialized event: [FormEventHandler(formStr(MarkupTrans), FormEventType::Initialized)] public static void MarkupTrans_OnInitialized(xFormRun...(read more)


    This time last year I wrote my Top 3 themes of 2017 article on what were the major events and directions from the year for the Dynamics 365 ecosystem. The start of a brand new year always feels like the logical moment to reflect back on the past 365 days, so this sounds like a worthy tradition to keep going. Here are my Top 3 picks from 2018 and some thoughts on how they might influence the direction of the year 2019 ahead.

    Power Platform

    The biggest single announcement of 2018 came in March, when the Dynamics 365 Customer Engagement and PowerApps platforms were merged into one. It wasn’t until July that we began to see the Power Platform term used to describe this new suite of tools, which is now the way to extend both Dynamics 365 and Office 365 apps, as well as to build brand new apps for customer-specific scenarios. All of a sudden, the technology that had been bubbling under in the Dynamics CRM corner room has been brought onto the main stage of the MS business software show.

    The immediate impact was that XRM became CDS 2.0 (Common Data Service for Apps), which probably hasn’t been all that easy for non-Dynamics professionals to understand if they only paid attention to official MS information sources covering the topic. For the Dynamics partners, a nice upside of this merger was PowerApps P2 becoming the “naked XRM” platform license they had been asking for for many years (compared to the earlier Dynamics 365 Plan license bundling CRM + ERP, which I don’t think was in as high demand).

    A more subtle but equally important change was the birth of model-driven app and canvas app concepts. No, not the marketing terms nor the division into two app types, rather the fact that these different client technologies now had a clear need to start approaching one another in terms of how they behave, what data sources they support and how they are administered. Examples of these have become visible through recent announcements like:

    It would be perfectly justified to call 2018 “the year of the platform”, considering how significantly the investments from MS side seem to have shifted from Dynamics 365 to the Power Platform. During 2019 we’ll see if the partner channel can follow along, to transform their offering into something more in line with the PowerApps story than the traditional CRM business models that have mostly been just revised for the cloud based environments during recent years.

    A similar challenge awaits the professionals who’ve been working in this business and now need to figure out how to put their existing skills to use in projects that may not even mention the Dynamics product name anywhere. Plenty of new skills will also need to be acquired for leveraging the broader toolkit. The recent announcement of Dynamics 365 certification exams being retired gives an indication of the looming new requirements that await the MCPs wanting to keep their certification record current.

    One Version

    My Nr. 2 theme from 2017 was the App/Plat separation that largely took place as part of the version 9 release. Now that Dynamics 365 CE runs purely on Azure once all orgs get to v9, the next logical step is to start delivering new releases on it the same way a modern cloud-native product would. PowerApps, Flow and Power BI have already been operating as a service with a single version for all customers, and now the platform underneath Dynamics 365 as well as the Apps on top of it are set to transition into this model. The July announcement of how Microsoft plans to deliver predictable updates with continuous deployment for both Customer Engagement and Finance & Operations is another major event of 2018 that will shape the future of these product lines and introduce a new reality for customers who build their digital business processes on top of them. The old CDU process for version update scheduling is no more, and everyone will get the April 2019 update based on the public release schedule.

    Whereas the earlier April 2018 and October 2018 updates have felt a bit like a “preview” of how the current Microsoft product group would want to run things, the schedule and communication around April 2019 are starting to show for the first time what the new “business as usual” mode of operation for the MS Business Applications organization is. The roadmap site is back and extended to all products, giving customers a logical place to watch for information on upcoming features. Still, there are plenty of open questions about how the 2019 plan rolls out. The coming update is supposed to be a major release that lifts everyone onto level 10 at once. How will the new preview instance system satisfy the testing needs of customers and partners? What actual features of the April 2019 release train will be delivered in that version change? How well will the opt-in process and “off by default” policy align with end-user training needs? All this and more is going to make the first half of 2019 an exciting time period.

    The one version policy naturally doesn’t cover on-premises environments, where customers and partners are still tied into the traditional upgrade projects. In December 2018 the new bits for Dynamics 365 CE Server were finally made available, showing that Microsoft hasn’t stopped developing the on-premises product. There were two years between the latest on-prem releases (v8.2 to v9.0.2), so it would be risky to place any bets on when & how the next release will be delivered. 2019 will see many customers working their way to get to V9, only to be less behind the cloud platform version than before, but those without regulatory reasons to keep running their own servers will hopefully be focusing their energy on planning how to move to the cloud based Power Platform instead.

    AI Journey

    If the two other top themes have had a fairly big impact on the short term already, then the third place goes to more of a long-term initiative that didn’t yet change the lives of Dynamics 365 customers and partners in 2018. However, given how obviously massive Microsoft’s investments in this area are, I think it’s perfectly justified to mark 2018 as the start of the commercial AI journey as it relates to Dynamics 365. Dynamics 365 is to Microsoft’s AI exactly what it is to other Azure services: the showcase of what these cool innovations from the R&D labs can do when positioned in real-world business scenarios. The moment those consumption-based Azure services targeted purely at developers start finding their way into user-based Dynamics 365 product licenses aimed at business decision makers, that’s when you know the technology has moved a big step closer to the mass market.

    OK, so the “AI” that software vendors refer to most often isn’t about creating true thinking machines to rival humans in their cognitive processes but rather more about taking the huge piles of data that digital service providers collect these days and applying some machine learning algorithms on top of it (example data sources being LinkedIn’s user network or internal MS Exchange data for organizations). Compare this new generation of software to traditional enterprise information systems that mainly collected data entered by information workers and produced some performance reports from it – then yes, there’s definitely a whole lot of intelligence in what guidance the modern apps can give to end users. The key difference being that while it technically may have been possible to construct such systems in the past, these type of intelligent features are now becoming the out-of-the-box experience that starts to define what the software product is actually perceived to be. Basic CRM systems have long been a commodity and the building blocks used in them are now actively being sold by MS as solutions for quickly developing any kind of business apps, in the form of Power Platform. The dedicated Dynamics 365 Apps for specific business processes will therefore need to become pretty darn intelligent to justify the additional cost over vanilla CDS and PowerApps.

    As a part of Satya Nadella’s announcement in March on MS organization changes, the Business Applications Group led by James Phillips was given responsibility for “Business AI”. It doesn’t come as a surprise, then, that in Fall 2018 we saw the AI apps become an official category in the Dynamics 365 product portfolio. Separating parts of the Sales, Service and Marketing functionality from the core Dynamics 365 Enterprise Apps into brand new “AI for X” apps makes room for the software suite to further extend its reach into more specific user groups, as well as keep growing the stack into new price ranges beyond Enterprise. In 2019 it’s going to be interesting to see what the response and adoption from the customer side will be for these new products: is this the “AI as a service” that companies using Dynamics 365 are looking for? At least now there will be more concrete examples to start this AI discussion with, and for MS to gain feedback on which capabilities have broad enough demand on the market to justify building them into ready-made apps instead of raw APIs.

    The AI journey does also touch the analytics side of the business, in the sense that in 2018 we saw a broad push of Azure Data Lake as the place where companies should be storing their data from MS and non-MS systems. The original Common Data Service for Analytics announcement from March turned into a silent rebranding of this “other CDS” into Power BI dataflows, which reached a preview stage just before the year’s end. It’s obvious that AI applications will need a lot more data to crunch than what the relational CDS for Apps databases normally contain, so expect to see the Azure Data Lake leveraged in many of the scenarios that MS will be promoting to their Dynamics 365 customers in 2019.

    The post Top 3 Themes for Dynamics 365 in 2018 appeared first on Surviving CRM.


  • 12/30/18--15:38: CodinGame Contests Meet F#
  • This post is part of the F# Advent Calendar 2018 organized by Sergey Tihon . Introduction If you know me, it’s no secret that I’m a big fan of the CodinGame.com site. I even...(read more)


    If you were hoping to read (after a long time) a technical post from me, I am sorry, but you need to wait a bit longer :)
    In this post I will write about Dynamics Weekly (first part) and what I have been up to during 2018 (second part). Let's start.

    Dynamics Weekly is my newsletter about Dynamics 365 CE; I started it at the end of 2017 and it's going well.

    A chart of the subscribers:

    This chart shows two things:

    1. There is a steady increase in subscribers, which is nice
    2. I am terrible at promoting the newsletter, because apart from the initial growth (maybe due to my spamming on Twitter and LinkedIn) the final count is relatively low

    Let me tell you how Dynamics Weekly started. Back in December 2017, I realized that due to my commitments (both work and personal) I was not able to keep track of all the news. Dynamics is my work, but I also like it, and it's a pity if I miss the cool stuff.
    I already knew that 2018 would be the same, so I needed a way to force myself to keep checking the Dynamics world. A weekly newsletter is a good way to summarize the content published during the previous days, and having the email in my inbox allows me to search for specific posts that I vaguely remember using just a few keywords.

    Most of the time I am able to collect the newsletter material during the week; sometimes I do it on the Sunday evening before sending. But keeping Dynamics Weekly running is useful to me and to the subscribers, and in the last year I received many messages of appreciation, so thanks to all of you.
    Of course, Dynamics Weekly would not exist without all the contributors in the Dynamics communities. I am honored to know some of them, and the time and energy they spend creating all kinds of content is far more important than the time I spend preparing the newsletter.

    And now the second (brief) part of this post. How was my 2018? To be honest, not great: it really drained me of energy and the final reward was very small. But I met new friends, I learned new things (not related to Dynamics), and I am positive 2019 will be a better year.

    PS: On 19 January I will join the "Dynamics Power! 365 Saturday London" event, see you there.

    Thanks!


    DMM365: Data Migration Management tool for Dynamics 365. Given: deploy specific Webform Steps and specific Webform Metadata. Conditions: modified since the last deployment on 31 August for Portal X, exclude...(read more)


    Portal 365/Adxstudio portal deployment short review. This article reviews how modern CRM portals can be quickly copied or deployed for free, which is why Scribe, KingswaySoft, Cobalt and other paid services...(read more)


     Hi Readers,

    I hope you are all following this series and have learned something new from it.

    If you haven't read any article in this series, please refer to the Table of Index.



    Read Complete Article »
