I am using the Navision 2009 SP1 version. How can I find out, for posted transactions in the G/L register, the SYSTEM DATE (not the posting date)?
↧
Forum Post: Navision Dynamics - Posted Entries system date
↧
Forum Post: RE: How to print a title or caption in header of a report only in the page based on particular line item value?
Hi Suresh, thanks. Could you please help me out with this? As mentioned, I am now able to get the number of lines per page. Could you please suggest how I could limit the number of lines per page in the report, and further, how I can implement my condition to print for a specific line item value? Please advise; I am stuck here.
↧
Forum Post: RE: Navision Dynamics - Posted Entries system date
From the posted transaction, choose Navigate --> G/L Entry and take note of the lowest G/L Entry No. Go to the G/L Register and filter "From Entry No." with that G/L Entry No.; you should get the Creation Date of the entries.
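The same lookup can also be done in code. A minimal AL sketch, assuming the standard "G/L Register" table (fields "From Entry No.", "To Entry No.", and "Creation Date" from the base application; verify the names against your version):

```al
// Given a posted G/L entry number, find the register that contains it
// and return the system date on which the batch was posted.
procedure GetCreationDate(GLEntryNo: Integer): Date
var
    GLRegister: Record "G/L Register";
begin
    // Each register covers the range "From Entry No." .. "To Entry No."
    GLRegister.SetFilter("From Entry No.", '<=%1', GLEntryNo);
    GLRegister.SetFilter("To Entry No.", '>=%1', GLEntryNo);
    if GLRegister.FindFirst() then
        exit(GLRegister."Creation Date"); // system date, not the posting date
    exit(0D);
end;
```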
↧
Blog Post: Business Central Booster Hackathon
Drum roll, please... announcing the location and date for our first Business Central Hackathon session, bringing professionals together to challenge the Business Central system. We challenge the common way of developing and ultimately we dare you to challenge... Read the full text.
↧
Forum Post: RE: Configure HMRC Vat Setup for NAV 2018 (GB Localization).
The issue is solved. In my case it was caused by a wrong secret code.
↧
Forum Post: Data Exchange Framework in navision
Can someone please explain how the Data Exchange Framework works in Navision for XML import? Thank you.
↧
Forum Post: RE: BC365 How to bypass the approval process on certain condition?
Hi Teddy, if I do it with one line I will get this error: "This document can only be released when the approval process is complete". Thanks.
↧
Forum Post: RE: Data Exchange Framework in navision
Please check the link below to see if it helps you: https://docs.microsoft.com/en-us/dynamics-nav-app/across-about-the-data-exchange-framework
↧
Blog Post: Business Central functionality for Russia
We are glad that it has happened! Awara IT cares about the fate of Business Central in Russia and is ready to help improve and develop this product. We have a dedicated and competent team ready to share its knowledge and make Business Central more popular and accessible to users. Over the last three months we posted bugs that existed in different NAV versions and were inherited by Business Central. Microsoft reacted, and now we have CU8 for BC. Thanks to Alexey Finogenov and Jérome Lefevre for their help! https://support.microsoft.com/en-us/help/4506821/cumulative-update-08-for-microsoft-dynamics-365-business-central-on-pr Also this month Microsoft published the Russian localization documentation for Business Central; Awara IT worked on this description for several months, and now we have it! Later this month the description will also be available in Russian. In the documentation you will find not only descriptions of the standard objects from previous NAV versions, but also some Russian-specific processes and how to use the corresponding documents. Thanks to Eva Dupont and Andrey Panko for making this happen! https://docs.microsoft.com/en-us/dynamics365/business-central/localfunctionality/russia/russia-local-functionality Thank you very much, Awara IT team, for accumulating this experience, and especially Alia Salikhova and Diana Malina for their great work in contributing this knowledge!
↧
Blog Post: Financial services: How to leverage data to strengthen customer loyalty
The landscape of modern banking has changed dramatically in the past decade. Financial institutions are facing intense competition not only from traditional competitors, but from new cloud-based virtual banks that have appeared with recent technology advancements. Financial Service Institution (FSI) customers expect their institutions to deliver digital-forward, frictionless, and highly relevant experiences across multiple channels. With hundreds of options just a click away, customers are more selective than ever, and they won't hesitate to leave if their needs aren't being met or exceeded. According to a report by Qualtrics, nearly 70 percent of customers who left their financial institution cited poor service experiences as the reason. To navigate this competitive market, it's essential for financial institutions to create the modern, personal experiences customers expect in order to secure their trust and loyalty and ultimately drive customer retention. Personalization is the new differentiator for FSI: According to a 2016 benchmarking report by Personetics, only 31 percent of customers think their financial institutions know them and their needs well. This means that more than two-thirds of current customers are at least relatively dissatisfied with their banking experiences and at risk of churning. In order to strengthen their customer relationships and secure business versus the competition, financial institutions need to start providing their customers with the experiences they expect. To win client loyalty, a new, integrated, data-driven approach is necessary: one that provides the means to personalize the engagement and interaction for every client by: Utilizing a holistic client view to create personal experiences based on individual client financial goals.
Leveraging multiple channels to naturally engage clients when and where it's convenient for them, creating an authentic customer experience and delivering a seamless, relevant experience at all touchpoints. Using data analytics and machine learning to anticipate client needs and prevent problems before they arise. By leveraging client data in this way, financial institutions can create customized experiences that account for a client's individual history, household, circumstances, life events, preferred engagement channels, and more. Enabling intelligent service that keeps customers coming back: By harnessing their client data to derive actionable insights, financial institutions can enable personalized engagements both online and in person through unique, situation-based offers, self-service capabilities, and intelligent customer service experiences. By empowering agents with a single, unified view of each client, institutions can provide customers with the fast, intelligent, and personal service they have come to expect. Most financial customers are goal-oriented, and they need help reaching those goals. With comprehensive data and insights into their customers' individual needs and actions, institutions can provide the right offer or service to the right client at the right time, helping them achieve their goals. Machine learning can help derive even deeper patterns from customer data that can predict next-best actions and make intelligent recommendations tailored to individual customer scenarios. From highlighting a new account feature based on a customer's behaviors to suggesting a new local branch based on their location, personalized experiences and recommendations help financial institutions build the lasting relationships with clients that are necessary for success in today's competitive market. This is a relationship that ultimately results in greater client loyalty and increased wallet share.
Strengthening trust and reducing churn: In a recent FIS Performance Against Customer Experience (PACE) report, customers reported safety and trust as their primary values of importance in their relationship with a financial institution, making it absolutely essential for financial institutions to include anomaly and fraud detection in their efforts to strengthen client loyalty and retention. By leveraging AI to analyze customer data in search of suspicious or unusual activity, financial institutions can identify and resolve potential issues before they arise. Behavioral analysis also enables active churn prediction, identifying actions and behaviors that indicate a customer is about to leave. This provides an opportunity to re-engage those at-risk customers to address any dissatisfaction, as well as deliver personally tailored offers that can help win back the customer's trust and avoid potential losses. How a customer data platform can help: Financial institutions collect both structured (e.g., client profiles, historical information) and unstructured (e.g., archived email correspondence, agent notes) client data across many systems and channels. Yet without the ability to effectively unify all of this data, it can be difficult or impossible to derive the holistic customer profiles and actionable insights necessary to personalize the customer experience and drive retention. With an enhanced customer data platform like Dynamics 365 Customer Insights, companies can bring together their customer data from all sources to gain a truly 360-degree view of their customers, unlocking insights that power personalized, authentic engagement at every touchpoint. 3 ways to learn more about Customer Insights: Watch our webinar Personalized Experiences with a 360-Degree View of Your Customers for a closer look at how Customer Insights helps your marketing, sales, and service professionals tailor digital and one-to-one customer interactions at scale.
Visit the Customer Insights website to explore the features, read customer stories, and view additional resources. Watch Microsoft CEO Satya Nadella and Head of Product Satish Thomas demonstrate Customer Insights: Watch Customer Insights video The post Financial services: How to leverage data to strengthen customer loyalty appeared first on Dynamics 365 Blog .
↧
Blog Post: The Dynamics 365 for Marketing July 2019 update is rolling out now
The July 2019 update of Dynamics 365 for Marketing is rolling out starting now! This update includes a few new features, plus plenty of performance and stability improvements. We’ll be rolling out the update on a region-by-region basis, and we expect it to be available to all regions by mid-July 2019. This update installs version 1.35.4825.0 of the Dynamics 365 for Marketing solution. Keep reading to learn how to get this update and which new features it includes. For more information about this product version, including known and recently fixed issues, see the Dynamics 365 for Marketing readme page. For more information about recently released and planned new features, please see the Dynamics 365 and Power Platform Release Plan. How to get the update: To benefit from this update, you must apply it to each of your Dynamics 365 for Marketing instances manually. As soon as the update is available in your region, you’ll be able to see and apply it as described in Keep Marketing up to date. Dynamics 365 for Marketing receives regular incremental updates (released every month or so) and major updates (released about twice a year). The most recent major update was released in April 2019. Incremental updates (including this release) must be applied manually between major releases. Each update includes all previous updates, so you will always move up to the current version when you update manually. Major updates can be applied manually any time after they are released, but Microsoft will eventually apply the most recent major update for you to make sure all of our customers are running the minimum supported version. Updates applied by Microsoft provide the oldest supported major release only, which includes all previously released updates, but not subsequently released updates (even if they are already available at the time of the update). New installs and trials will always get the most recent incremental update available in their region.
Prevent sending emails from unauthorized domains: Domain authentication with DKIM is an increasingly important part of making sure your messages land in recipients’ inboxes rather than getting filtered away as junk. It requires that the from-address for each message you send shows a domain that you’ve authenticated for DKIM. Microsoft is dedicated to helping our customers achieve maximum email deliverability, so we’ve added a few features to help make sure you don’t overlook or inadvertently work around your DKIM setup: The error check for email messages will show a warning if you try to go live with an email message that has a from-address not associated with any of your DKIM domains. You can now set a default sending domain for your instance. If this is set, then the from-address for all of your email messages will automatically be adjusted to show the default domain (if it initially uses a non-authenticated domain) each time you create a new email message or change the user shown in the From field. All new instances and trials will automatically authenticate the instance domain with DKIM and set that domain as the default sending domain for your instance. To set your default sending domain, go to Settings > Advanced settings > Marketing settings > Default marketing settings. Then open the marketing settings record and go to the Marketing email tab to set the Default sending domain. More information: Authenticate your domains. Accurate member counts for draft segments: The segment designer already offered a feature where you could request an estimate of the size of your segment while still working in the draft state. However, this feature previously provided only a very rough estimate. With the July 2019 update, you’ll now receive a much more accurate estimate; in nearly every case, you’ll get a fully accurate member count based on your current segment settings.
The post The Dynamics 365 for Marketing July 2019 update is rolling out now appeared first on Dynamics 365 Blog .
↧
Forum Post: Dynamics NAV not clickable in Excel
Dear all, I hope someone can help; I'm such a novice with all this... My Dynamics NAV is grayed out in Excel, so I cannot update my reports... https://filestore.community.support.microsoft.com/api/images/ebf70c5d-4685-4298-9ac1-2066109d97af?upload=true Thank you very much in advance!! val
↧
Forum Post: RE: Dynamics NAV not clickable in Excel
Have you exported the data to Excel from NAV? Once you export the data, that control will be enabled and you can update and refresh.
↧
Forum Post: RE: Dynamics NAV not clickable in Excel
Thank you so much for answering me, Suresh! Unfortunately, that doesn't seem to be the issue; the data have been exported from NAV to Excel. I just re-exported another database (just to check), and it still doesn't do the trick :( v.
↧
Blog Post: Using Gmail Account with SMTP Setup with Dynamics Business Central / Dynamics NAV
Overview: Not all customers use Office 365 as their e-mail provider. The other popular option companies use is G Suite from Google. When you try to set up SMTP Mail using Gmail accounts in G Suite, you may encounter this error: The mail system returned the following error: "Failure sending mail. Unable to read data from the transport connection: net_io_connectionclosed.". Resolution: The problem is with how Google detects which applications it deems less secure. If you're using an application it deems less secure, Google will refuse the connection. What you'll need to do is change your settings in Gmail: turn on the Less secure app access setting. Conclusion: Now when you go back to your Dynamics 365 Business Central (aka Dynamics NAV) application, you will be able to send mail with the SMTP settings.
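For reference, the typical Gmail values can also be filled programmatically. A hedged AL sketch, assuming the standard "SMTP Mail Setup" table and its SetPassword helper from the base application (field and procedure names may differ in your version):

```al
// Sketch: fill the SMTP Mail Setup record with common Gmail values.
procedure ConfigureGmailSmtp(UserName: Text[250]; Password: Text[250])
var
    SMTPMailSetup: Record "SMTP Mail Setup";
begin
    if not SMTPMailSetup.Get() then begin
        SMTPMailSetup.Init();
        SMTPMailSetup.Insert();
    end;
    SMTPMailSetup."SMTP Server" := 'smtp.gmail.com';
    SMTPMailSetup."SMTP Server Port" := 587;  // TLS; Gmail also accepts 465 (SSL)
    SMTPMailSetup.Authentication := SMTPMailSetup.Authentication::Basic;
    SMTPMailSetup."Secure Connection" := true;
    SMTPMailSetup."User ID" := UserName;
    SMTPMailSetup.SetPassword(Password);
    SMTPMailSetup.Modify();
end;
```

Even with these values, Basic authentication will fail with the net_io_connectionclosed error until Less secure app access is enabled on the Google account, as described above.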
↧
Forum Post: RE: Navision Dynamics - Posted Entries system date
You can look up the system creation date in the G/L Register. You will need the first or last G/L Entry No. in the transaction.
↧
Forum Post: RE: Unable to delete sales lines
If that's the case, debug while you delete the sales line; you will find the problematic code. Look at the variable that causes the error and check its SecurityFiltering property. Chances are the property is set to Disallowed or Validated. You need to change it to Filtered so the security filter on the permission set is respected.
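In AL the same property can be set in code via the record's SecurityFiltering method. A minimal sketch (the procedure and its logic are illustrative, not taken from the poster's code):

```al
// Sketch: delete only the sales lines the user is allowed to see.
// SecurityFilter::Validated raises an error when a filtered-out record is
// touched; SecurityFilter::Filtered silently restricts the result set instead.
procedure DeleteVisibleSalesLines(DocNo: Code[20])
var
    SalesLine: Record "Sales Line";
begin
    SalesLine.SecurityFiltering(SecurityFilter::Filtered);
    SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
    SalesLine.SetRange("Document No.", DocNo);
    SalesLine.DeleteAll(true);
end;
```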
↧
Forum Post: RE: Dynamics NAV not clickable in Excel
Try disabling the add-in using File --> Options, and then re-enable it again.
↧
Forum Post: RE: Unable to delete sales lines
Dear Sir, when I try to debug it, I find the code below, where the arrow is marked. Please suggest what extra lines I need to add to this code.
↧
Blog Post: AL Automated Tests: How to Write Your First Test?
Like any other code, AL code can be flawed and can break during refactoring. To avoid manually testing every time you make even the smallest change in code, you should write AL automated tests. AL automated tests are pivotal to ensure that your code works correctly after refactoring or the addition of new functionality. In this article, we are going to talk more about how you should structure your AL automated tests rather than about the code logic itself. Prerequisite: If you are using Docker as the platform that holds your development databases, you will first need to import the Testing Toolkit. This is necessary so that you have access to the Assert codeunit, which is used to set conditions that check whether your code works as intended. Follow these steps to import the Testing Toolkit into your container: Open PowerShell. Enter the command “Import-TestToolkitToNavContainer”. Restart the container or regenerate symbols any other way. Download the new symbols to your AL solution. Start writing AL automated tests: Write all AL automated tests in a separate codeunit; to be able to do that, set the codeunit’s Subtype to Test. Each separate test needs an indication that it is a test: you indicate that by writing [Test] before the procedure. The rest of the codeunit follows the same structure as a regular codeunit, except for the Assert before the end of the procedure. You can declare Assert in the local procedure variables or as a global variable. The code inside the procedure sets up the test itself; what goes into it depends on what exactly you are testing. In this example, the code checks whether the autofill code works correctly. It does that by initiating a new entry, entering a value in the “CodeField” field, and activating the OnValidate trigger. Then, with Assert, it checks whether the field that should be filled is in fact filled. The text “Field is empty” shows only if the test fails.
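The structure described above can be sketched as a minimal test codeunit. The object number, table, and field names here are hypothetical placeholders standing in for the article's "CodeField" example; only Subtype = Test, the [Test] attribute, and the Assert codeunit are from the standard pattern:

```al
// Sketch of a minimal AL test codeunit, assuming the Assert codeunit
// from the imported Testing Toolkit is available.
codeunit 50140 "Autofill Tests"
{
    Subtype = Test;

    [Test]
    procedure AutofillFillsFieldOnValidate_Test()
    var
        Entry: Record "My Entry"; // hypothetical table from the example
        Assert: Codeunit Assert;
    begin
        Entry.Init();
        Entry.Validate("CodeField", 'TEST'); // fires the OnValidate autofill logic
        Assert.AreNotEqual('', Entry."Filled Field", 'Field is empty');
    end;
}
```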
This is the code that the test is testing. You can also create procedures that are not tests; simply don’t mark them as tests. Run AL automated tests: In order to run the AL automated tests you have written, publish your application, go to the “Test Tool” page, upload all “Test” codeunits, and run them from there. The easiest way to access this page is to configure your launch.json so that the startup page is 130401. When publishing your extension, be ready to be taken straight to the testing page, where you will need to upload all test codeunits. Complete this action by pressing the Get test codeunits button. When uploading the codeunits, you can run either only selected ones or all of them at once. As you can see, next to a failed test there is the text that you wrote in the Assert. Additionally, all the changes made to the database (for example, newly created entries) are rolled back after all tests are done. How you should write AL automated tests: Let’s start with the codeunits themselves. A good practice is to dedicate a whole codeunit to one part of your solution. You can split AL automated tests into different codeunits by code functionality, making the test library easier to understand and maintain. So, if your solution creates new entries and posts reports, create two different codeunits: one to test the entry-creation functionality and a second to test the posting functionality. Inside these codeunits you create the tests themselves. Each test should have an informative name, meaning it should speak for itself. For example, if a test checks whether the solution actually creates a new empty entry, its name should look something like CreateNewEmptyEntry_Test. Some names can get really long, but don’t worry about that; it’s better to have longer, clearer names than short ones. You should put only one Assert in each test.
There’s no one stopping you from putting in as many Asserts as you want, but that can be confusing, since if at least one Assert fails, the whole test is marked as failed. In that case, you have to go into the test and figure out from the Assert which check failed and why. This is the reason it is important to have clear names: in case of a failure, you can just look at the name and go straight to fixing it. Your AL automated tests have to be specific; each should test only a certain part of your code and nothing else. It is also important to mention that it is always better to test each part of the functionality. For example, say we have a table with three fields and a function in a codeunit that creates a new entry and fills the fields with random data. To test this functionality, we create a codeunit for it. The first test checks whether an entry is created at all. Then we need a test to check whether a value was added to the first field. Finally, we write tests to check whether values were filled in the other two fields. Divide your test solution into the smallest possible parts; it has to cover as much code as possible. It is recommended to write more than one unit test for each part of the code, covering as many scenarios as possible. That way you not only stop bugs from appearing after refactoring, but you may also find bugs that you did not expect. Your AL automated tests should be independent of any external programs or files. If you cannot avoid using an external file, try to find a way to create a dummy file inside the test and feed it to your solution as if it were a real file. If your solution needs to call external services, create dummy services inside your test. If you want your AL automated tests to be as efficient as possible, it’s a good idea to provide many “hooks” that your tests can attach to.
That way you can test a tiny part of the code instead of running the whole program; just do not forget to test whether a value was added to a field or not. Hooks can be described as global procedures that can be called from a test itself; another popular definition is extra procedures that call the procedure you want to test. Conclusion: Let’s repeat the most vital steps for writing efficient tests: Create a codeunit for each functionality. Name your unit tests meaningfully. Divide your code functionality into parts as small as you can and test each part individually. Keep one Assert per test. Make your solution code easily accessible from the tests. All in all, sticking to the steps mentioned above will help you write codeunits with AL automated tests that will last the lifetime of your solution. How can you tell that your tests are good enough? If a test doesn’t need fixing each time you make a change in your code, you can pat yourself on the shoulder! Don’t forget, practice makes perfect, and if you don’t have time to practice enough, leave it to professionals. We are always ready to help with any type of Microsoft Dynamics NAV Upgrade / 365 Business Central Upgrade! The post AL Automated Tests: How to Write Your First Test? appeared first on Simplanova.
↧