Archive for the ‘Microsoft Dynamics Ax’ category

Dynamics 365 Project Operations

September 13th, 2021

Dynamics 365 Project Operations will soon mark one year since general availability, and more features will arrive next month in the Wave 2 update. It combines previous solutions (the Project Management and Accounting functionality of Dynamics 365 Finance and Supply Chain, Dynamics 365 Project Service Automation, and Microsoft Project) into a single set of user scenarios typical of project-centric businesses.

The long-term approach to the database back-end for Dynamics 365 Project Operations is the Dataverse (formerly known as the Common Data Service), which consists of foundational, secure data entities that enable standard, mainstream business use cases with reusable business logic.

Project Management and Accounting development work will continue in X++, while Project Service Automation solutions will still be built on the Power Platform.
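For developers, the practical consequence is that Project Operations data built on Dataverse can be reached through the standard Dataverse OData Web API. Below is a minimal sketch; the environment URL, table set name and column name are placeholders, and the Azure AD bearer token is assumed to have been obtained separately (for example via MSAL):

```python
# Minimal sketch: read rows from a Dataverse table through the OData Web API.
# The environment URL, table set name and column name are placeholders, and
# ACCESS_TOKEN is an Azure AD bearer token obtained separately (e.g. via MSAL).
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"      # hypothetical environment
ACCESS_TOKEN = "<bearer token from Azure AD>"

def list_rows(table_set: str, top: int = 10) -> list[dict]:
    """Return up to `top` rows from a Dataverse table via the Web API."""
    url = f"{ENV_URL}/api/data/v9.2/{table_set}?$top={top}"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    }
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()["value"]

if __name__ == "__main__":
    for row in list_rows("msdyn_projects"):       # hypothetical table set name
        print(row.get("msdyn_subject"))           # hypothetical column name
```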

IFRS 17 – compliance accelerator system – ask Synergy Software Systems.

September 1st, 2021

IFRS 17 is the newest IFRS standard for insurance contracts and replaces IFRS 4 on January 1st, 2022. It states which insurance contract items should appear on the balance sheet and in the profit and loss account of an insurance company, how to measure those items, and how to present and disclose that information.

This is a big change for insurance companies: data administration, financial presentation and actuarial calculations will all need to change.

 

Why are IFRS 9 and IFRS 17 implemented together?

  • The insurance liability (IFRS 17) is always closely connected to the financial instruments (IFRS 9) that insurers hold.
  • When a client buys an insurance policy, an insurance liability is created, and the premiums paid are used to buy financial instruments.
  • Insurers want to reduce the volatility in their earnings, and there are choices within IFRS 9 and IFRS 17 that affect that volatility.
  • Under IFRS 17, insurers can decide whether the results of changing financial risk assumptions go through OCI or through the profit and loss account.
  • Under IFRS 9, insurers can decide whether changes in equity go through profit and loss or through OCI.

Both standards affect earnings volatility, so balance sheet management choices under them are connected. Consequently, the IFRS board decided to grant insurers the option of implementing both standards together.

IFRS 9 explains the classification and measurement of financial instruments, and so helps to improve the disclosure of information around them. Many perceived that disclosure as inaccurate during the financial crisis: impairments on financial instruments, for example, were taken too late and for amounts that were too small.

IFRS 9 makes the classification of each financial instrument more logical and principle-based. Two questions need to be answered for the classification:

  • Why is the company holding the asset: just to collect the cash flows from the underlying asset, or is the asset also held for trading?
  • What kind of asset is the financial asset: a derivative, an equity or a debt instrument? The SPPI (solely payments of principal and interest) test determines whether an instrument is really a debt instrument.

The classification (illustrated in the sketch below) determines:

  • which accounting principles are used;
  • whether the instrument is measured at fair value or at amortized cost;
  • and whether gains and losses go through the profit and loss account or through the OCI (other comprehensive income) account.
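To make that decision logic concrete, here is a simplified sketch of the headline IFRS 9 outcomes (amortized cost, FVOCI, FVTPL). It only captures the business-model and SPPI questions above and is in no way a complete implementation of the standard:

```python
# Simplified sketch of the IFRS 9 classification logic described above.
# It only captures the headline decision tree (business model + SPPI test);
# real classifications involve many more judgements and options.

def classify_instrument(kind: str, passes_sppi: bool, business_model: str) -> str:
    """Return a headline IFRS 9 measurement category.

    kind            -- "debt", "equity" or "derivative"
    passes_sppi     -- True if cash flows are solely payments of principal and interest
    business_model  -- "hold_to_collect", "hold_to_collect_and_sell" or "trading"
    """
    if kind in ("derivative", "equity"):
        # Equities and derivatives default to fair value through profit or loss
        # (an OCI presentation election exists for some equities - not modelled here).
        return "FVTPL"
    if passes_sppi and business_model == "hold_to_collect":
        return "amortized cost"
    if passes_sppi and business_model == "hold_to_collect_and_sell":
        return "FVOCI"
    return "FVTPL"

print(classify_instrument("debt", True, "hold_to_collect"))           # amortized cost
print(classify_instrument("debt", True, "hold_to_collect_and_sell"))  # FVOCI
print(classify_instrument("debt", False, "trading"))                  # FVTPL
```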

IFRS 9 also introduces a more dynamic credit loss model that instructs when an insurer should take an impairment on financial assets. The model is forward looking, so expected future losses must also be taken into account in the impairment (a simplified calculation is sketched below).
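As a one-line illustration of what "forward looking" means in practice, the textbook simplification of an expected credit loss is probability of default × loss given default × exposure at default, discounted back to today. A minimal sketch, with invented figures:

```python
# Minimal expected-credit-loss sketch: ECL = PD x LGD x EAD, discounted.
# The figures are invented for illustration only.

def expected_credit_loss(pd_: float, lgd: float, ead: float,
                         discount_rate: float, years: float) -> float:
    """Single-period expected credit loss, discounted to present value."""
    return pd_ * lgd * ead / (1 + discount_rate) ** years

# 2% default probability, 45% loss given default, 1,000,000 exposure, 1 year out
print(round(expected_credit_loss(0.02, 0.45, 1_000_000, 0.03, 1), 2))  # ~8737.86
```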

IFRS 9 also broadens the possibilities for hedge accounting, bringing it more into line with how insurers actually manage risk.

Why does this matter?

There is a huge impact on insurers and a big change in the disclosure.

  • Almost all of the asset and liability side is hit by the combination of IFRS 9 and IFRS 17.
  • New concepts and terms are introduced.
  • The standards will change the presented numbers: under IFRS 17 the insurance liability needs to be based on updated assumptions, which is not currently a requirement.
  • More data with more granularity and more history will challenge internal data storage, reporting and IT performance.
  • Reporting timelines are shortened, which will challenge the systems, and the cooperation between different departments.
  • New components such as unbiased cash flows, the risk adjustment, the discount rate and the CSM (contractual service margin) are introduced. The insurer therefore needs to understand the IFRS 17 principles and decide how to implement them: for example, which measurement model to choose for an insurance product and which transition measure to use.
  • In the balance sheet and income statement the insurance liability will be specified in a different way, the importance of gross written premiums will diminish, and equity will be impacted.
  • The presentation of the balance sheet and P&L is also significantly affected.
  • Risk engines are needed to calculate the CSM and to cope with all the different groups.
  • Insurers need to disclose information based on groups of contracts.
  • A group is a managed set of contracts (often a product) with the same inception year, classified at inception as profitable, onerous, or possibly becoming onerous (a grouping sketch follows this list). Insurance companies can have hundreds of groups, and IFRS 17 insists on this grouping for greater transparency, because insurers cannot offset the result of one group against another.
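To make the grouping idea concrete, here is a minimal sketch of bucketing contracts by portfolio, inception-year cohort and profitability assessed at inception. The contract records and bucket labels are invented for illustration only:

```python
# Sketch of the IFRS 17 grouping idea described above: contracts are bucketed by
# portfolio, annual cohort (inception year) and profitability assessed at inception.
# The contract data and bucket labels are invented for illustration.
from collections import defaultdict

contracts = [
    {"id": "C1", "portfolio": "Motor", "inception_year": 2021, "profitability": "profitable"},
    {"id": "C2", "portfolio": "Motor", "inception_year": 2021, "profitability": "onerous"},
    {"id": "C3", "portfolio": "Life",  "inception_year": 2020, "profitability": "may_become_onerous"},
]

groups: dict[tuple, list[str]] = defaultdict(list)
for c in contracts:
    key = (c["portfolio"], c["inception_year"], c["profitability"])
    groups[key].append(c["id"])

for key, members in groups.items():
    print(key, members)
```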

Synergy Software Systems has been implementing and supporting financial solutions in the insurance vertical for 25 years. If you need to rapidly implement a solution for IFRS 17 compliance that will sit alongside your existing ERP and finance systems, then call us on 0097143365589.

August 24th, 2021

A bungled migration of a network drive caused the deletion of 22 terabytes of information from the Dallas Police Department's systems – including case files in a murder trial – during a data migration exercise carried out at the end of the 2020-21 financial year.

“On August 6, 2021, the Dallas Police Department (DPD) and City of Dallas Information and Technology Services Department (ITS) informed the administration of this Office that in April 2021, the City discovered that multiple terabytes of DPD data had been deleted during a data migration of a DPD network drive,” said a statement [PDF] from the Dallas County prosecutor’s office.

14TB were recovered, presumably from backups, but “approximately 8 Terabytes remain missing and are believed to be unrecoverable.”

In a separate incident, the UK's Police National Computer also lost records. The Home Office initially issued a statement saying the data loss was down to a “technical issue”, which had been resolved. There must have been some technical resolution, because the Home Office later said it was not a technical issue after all but a “housekeeping error”, with Home Secretary Priti Patel saying: “Home Office engineers continue to work to restore data lost as a result of human error during a routine housekeeping process earlier this week.”

In a letter published by The Guardian, National Police Chiefs’ Council (NPCC) deputy chief constable Naveed Malik, lead for the organisation on the Police National Computer (PNC), said approximately 213,000 offence records, 175,000 arrest records and 15,000 person records had potentially been deleted in error. The DNA database connected to the PNC saw 26,000 records corresponding to 21,710 subjects potentially deleted in error, “including records previously marked for indefinite retention following conviction of serious offences”. The letter also said 30,000 fingerprint records and 600 subject records may have been deleted in error.

The PNC dates back to the 1970s. The current iteration is a Fujitsu BS2000/OSD SE700-30 mainframe based in a Hendon data centre, running Software AG's ADABAS database, accessed using its Natural programming language. The UK's territorial and regional police forces, the Serious Fraud Office, the Security and Secret Intelligence Services (MI5, MI6), HM Revenue & Customs, and the National Crime Agency all make use of it, with controlled, 24-hour access from remote terminals and through local police force systems.

These incidents highlight the importance of backups and of backup and recovery processes. How often do you test whether you can restore your backups? Does this still work for restoring older backups when you upgrade? Has a move to the cloud changed the retention of your backups, the frequency of upgrades, or the ease and time of a restore?
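Where the backups in question are SQL Server backups, a first sanity check can even be scripted. Below is a minimal sketch using pyodbc and RESTORE VERIFYONLY; the server name, driver and backup path are placeholders, and bear in mind that VERIFYONLY only proves the backup media is readable – only a periodic full test restore proves you can actually recover:

```python
# Minimal sketch: check that a SQL Server backup file is readable and complete.
# Server, driver and file path are placeholders; RESTORE VERIFYONLY validates the
# backup media but is no substitute for periodically performing a full test restore.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-sql-server;Trusted_Connection=yes;"
)
BACKUP_FILE = r"\\backupshare\finance\FinanceDB_full.bak"   # placeholder path

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    cursor = conn.cursor()
    try:
        cursor.execute(f"RESTORE VERIFYONLY FROM DISK = N'{BACKUP_FILE}'")
        print("Backup file is readable and complete.")
    except pyodbc.Error as err:
        print("Backup verification FAILED:", err)
```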

e-invoicing in KSA and Dubai – does your system meet the requirements? Ask Synergy Software Systems.

July 16th, 2021

The Kingdom of Saudi Arabia (KSA) announced e-invoicing regulations for resident companies, published on December 4, 2020. E-invoicing becomes mandatory for taxpayers from December 4, 2021.

The aims of the e-invoicing mandate are to provide more transparency and to enhance consumer protection; another benefit of e-invoicing implementation is the readability of the invoice formats.

Companies registered in Saudi Arabia should immediately start updating or changing their systems and processes to support the issuance of e-invoices. This may be a little challenging, but the key to a successful implementation is to start early.
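For reference, the widely documented Phase 1 ("generation") requirement includes a QR code whose payload is a base64-encoded TLV (tag-length-value) string carrying the seller name, VAT registration number, invoice timestamp, invoice total and VAT amount. A minimal sketch of that encoding follows – the field values are invented, and you should verify the current ZATCA specification before relying on it:

```python
# Minimal sketch of the commonly documented KSA e-invoicing Phase 1 QR payload:
# a base64 string of TLV (tag-length-value) fields. Values below are invented;
# check the current ZATCA specification before relying on this format.
import base64

def tlv(tag: int, value: str) -> bytes:
    data = value.encode("utf-8")
    return bytes([tag, len(data)]) + data

fields = [
    (1, "Example Trading LLC"),        # seller name
    (2, "310000000000003"),            # VAT registration number
    (3, "2021-12-04T10:30:00Z"),       # invoice timestamp
    (4, "1150.00"),                    # invoice total (including VAT)
    (5, "150.00"),                     # VAT amount
]

payload = b"".join(tlv(tag, value) for tag, value in fields)
qr_text = base64.b64encode(payload).decode("ascii")
print(qr_text)   # this string is what gets rendered as the QR code
```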

Note that Dubai has also announced similar legislation.

If you need to upgrade or change your system, or to add additional functionality to comply with the e-invoicing mandate, then please contact us on 0097143365589.

On this blessed occasion of Eid, we wish you and your family good health, wealth and prosperity. And don't forget to take time this summer to reflect on yourself and your business.

Synergy Microsoft Gold partnership

June 23rd, 2021

Our Microsoft Partner Network Gold competency membership has been confirmed again for the 15th time.

Quickly identify and fix your performance bottleneck

May 4th, 2021

Are you responsible for a busy SQL Server – for example, the Finance Department's systems, document management, CRM, BI, or a web server – perhaps a busy file and print server, or something else entirely?

Were you responsible for installing the application running the workload for your company? Is the workload business critical, i.e. TOO BIG TO FAIL?

Do users, or even worse, customers, complain about performance?

If you are responsible for keeping workloads running in your organization that would benefit from additional performance, please read on – even if you don't consider yourself a “Techie”.

Windows and virtual machines can both contribute to high latency, which impacts performance.

Variables Affecting the Performance of the Applications

There are many variables that affect the performance of those applications. The slowest, i.e. the most restrictive of these is the “Bottleneck”. Think of water being poured from a bottle. The water can only flow as fast as the neck of the bottle, the ‘slowest’ part of the bottle.

In computer hardware, the bottleneck will almost always fall into one of the following categories:

  • CPU
  • DISK
  • MEMORY
  • NETWORK

With Windows, it is usually very easy to find out which category the bottleneck is in, and here is how to do it (like an IT engineer):

  • To open Resource Monitor, click the Start menu, type “resource monitor”, and press Enter. Microsoft includes this tool as part of the Windows operating system, so it is already installed.
  • Notice the graphs in the right-hand pane. When your computer is running at peak load, or users are complaining about performance, which of the graphs are ‘maxing out’? This is a great indicator of where your workload’s bottleneck is to be found.
(Screenshot: Resource Monitor)
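If you would rather sample the same four resources from a script than from the GUI, here is a rough sketch using the psutil Python library; the thresholds are arbitrary illustrations, not tuning guidance:

```python
# Rough sketch: sample the four classic bottleneck candidates with psutil.
# Thresholds are arbitrary illustrations, not tuning guidance.
import psutil

cpu = psutil.cpu_percent(interval=1)      # % CPU over a 1-second sample
mem = psutil.virtual_memory().percent     # % RAM in use
disk = psutil.disk_io_counters()          # cumulative read/write counters
net = psutil.net_io_counters()            # cumulative bytes sent/received

print(f"CPU:     {cpu:.0f}% busy")
print(f"Memory:  {mem:.0f}% used")
print(f"Disk:    {disk.read_bytes / 1e6:.0f} MB read, {disk.write_bytes / 1e6:.0f} MB written since boot")
print(f"Network: {net.bytes_recv / 1e6:.0f} MB received, {net.bytes_sent / 1e6:.0f} MB sent since boot")

if cpu > 90:
    print("CPU looks like the likely bottleneck.")
if mem > 90:
    print("Memory looks like the likely bottleneck.")
```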

What You Can Do to Improve Application Performance

Once you have identified your bottleneck – the slowest part of your ‘compute environment’ – what can you do to improve it?

The traditional approach to solving computer performance issues is to throw bigger and more powerful hardware at the problem: an extra disk or a new laptop, more RAM in your workstation, or, at the more extreme end, new servers or expensive storage solutions.

How do you decide when it is appropriate to spend money on new or additional hardware, and when it isn't? The obvious answer: it isn't, as long as you can get the performance you need from the hardware infrastructure you have already bought.

You don't replace your car just because it needs a service or a tune-up.

Let’s take disk speed as an example. Look at the response time column in Resource Monitor. Open the monitor to full screen or large enough to see the data. On the Overview tab, open the Disk Activity section so that you can see the Response Time column.

Do it now on the computer you're using to read this. (You didn't close Resource Monitor yet, did you?) This shows the Disk Response Time – that is, how long the storage is taking to read and write data. Of course, a slower disk means slower performance, but what counts as a good or a bad response time?

Scott Lowe has written a great post – “TechRepublic: Use Resource Monitor to monitor storage performance” – that perfectly describes what to expect from faster and slower disk response times:

“Response Time (ms). Disk response time in milliseconds. For this metric, a lower number is definitely better; in general, anything less than 10 ms is considered good performance. If you occasionally go beyond 10 ms, you should be okay, but if the system is consistently waiting more than 20 ms for response from the storage, then you may have a problem that needs attention, and it’s likely that users will notice performance degradation. At 50 ms and greater, the problem is serious.”

I hope that when you check your computer, the Disk Response Time is below 20 milliseconds. What about those other workloads you were thinking about earlier? What are the Disk Response Times on that busy SQL Server, the CRM or BI platform, or those Windows servers the users complain about?
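If you want a very rough scripted spot check of storage latency, the sketch below times a series of small synchronous writes to a single file. It is indicative only and is not the same metric that Resource Monitor's Response Time column reports:

```python
# Very rough spot check of storage write latency: time a series of small
# synchronous writes to one file. Indicative only - not the same metric that
# Resource Monitor's Response Time column reports.
import os
import time

TEST_FILE = "latency_probe.tmp"
SAMPLES = 50
timings_ms = []

with open(TEST_FILE, "wb") as f:
    for _ in range(SAMPLES):
        start = time.perf_counter()
        f.write(os.urandom(4096))   # 4 KB write
        f.flush()
        os.fsync(f.fileno())        # force the write through to the disk
        timings_ms.append((time.perf_counter() - start) * 1000)

os.remove(TEST_FILE)
print(f"average write latency: {sum(timings_ms) / len(timings_ms):.1f} ms")
print(f"worst write latency:   {max(timings_ms):.1f} ms")
```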

Your Two Options

When the Disk Response Times are often higher than 20 milliseconds and you need to improve application performance, then it's decision time, and there are two main options:

  • Storage workload reduction software such as DymaxIO™ fast data (Diskeeper®, SSDkeeper® and V-locity® have now been consolidated into the new DymaxIO fast data software). This tool reduces Disk Response Times by allowing much of the data that your applications need to read to come from a RAM cache rather than being read from slower disk storage; RAM is much faster than the media in your disk storage (a simple illustration of the caching idea follows this list).
  • Contact us to trial this. You don't even need to reboot.
  • If you have tried the DymaxIO software and you still need faster disk access, then it's time to start getting quotations for new hardware. It makes sense, though, to take a couple of minutes to install DymaxIO first, to see whether that can be avoided. Removing storage inefficiencies in software is typically far more cost-effective than buying hardware – a software solution to a software problem.
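As promised above, here is a simple illustration of the caching idea – this is not how DymaxIO works internally, just a demonstration of why serving repeated reads from a RAM cache is so much faster than going back to the disk each time:

```python
# Illustration only (not how DymaxIO works internally): once a block has been
# read from disk, repeat reads of the same block can be served from a RAM cache
# instead of going back to the slow device.
from functools import lru_cache

# create a small demo file so the example is self-contained
with open("somefile.dat", "wb") as f:
    f.write(b"x" * 8192)

@lru_cache(maxsize=256)              # keep up to 256 recently read blocks in RAM
def read_block(path: str, block_no: int, block_size: int = 4096) -> bytes:
    with open(path, "rb") as f:
        f.seek(block_no * block_size)
        return f.read(block_size)

read_block("somefile.dat", 0)        # first call reads from disk
read_block("somefile.dat", 0)        # second call is a cache hit - no disk I/O
print(read_block.cache_info())       # hits=1, misses=1
```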

Improve Your Application Performance by Decreasing Disk Latency like an IT Engineer – call us to learn more 0097143365589

The New Dynamics 365 Project Operations – ask Synergy Software Systems, Dubai

January 16th, 2021

Almost a year ago, Muhammad Alam, Corporate Vice President Dynamics 365, shared the vision for a better product for project and service based businesses and industries. In March, Gurkan Salk was named the new General Manager for Project Operations at Microsoft.

Users of Dynamics 365 Project Operations often ask for a better way to collaborate in Microsoft Teams, and a new app experience has now arrived. (December 2020)

There are many ways in which Teams can be used to boost collaboration and efficiency while reducing reliance on email.
Add the Dynamics 365 App to Teams and use Project Operations inside Teams.
There is no need to step out of Teams for anything related to Dynamics work.
This app in general is for all the Dynamics 365 apps built upon Dataverse and the Power Platform, be it the Sales app, Customer Service, Project Operations, or others.
The benefit of working with Project Operations within Teams is improved collaboration while project execution is in progress. Many of the workflows in a typical project require constant email correspondence. That work can now be done via Teams, keeping inboxes clean and ensuring the right people stay informed.

What is the true cost of software development?

January 9th, 2021

There has been much talk of both DevOps and citizen developers.
While these new paradigms are welcome and bring many benefits, that does not mean they replace other proven systems of software development.

There are reasons why some consultancies quote significantly lower development times than others – usually it is a lack of knowledge or awareness of what needs to be considered, or they deliberately cut corners in areas like security, validation, documentation, testing, and so on.

If that sounds harsh, then take a look at this recent post:
A report published last week by the Consortium for Information & Software Quality (CISQ) estimates poor software quality collectively cost companies in the U.S. an estimated $2.08 trillion in 2020.

Ransomware that is Devastating MySQL Servers – be aware

December 29th, 2020

PLEASE_READ_ME is an active ransomware campaign that has been targeting MySQL database servers and dates back to at least the start of this year. The attack chain is extremely simple and exploits weak credentials on internet-facing MySQL servers – of which there are close to 5 million worldwide.

MySQL servers have often been used as a low-cost alternative for applications such as Dynamics AX Retail store databases.

250,000 databases are offered for sale in the attackers’ dashboard, from 83,000 successfully-breached victims.

If you are using MySQL databases, then we strongly recommend that you immediately review your credential security.
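As a starting point, it is easy to script a check for the kind of weak account set-up this campaign exploits. The sketch below uses mysql-connector-python to list accounts with no password set or that accept logins from any host; the connection details are placeholders, and note that on servers older than MySQL 5.7 the password column has a different name:

```python
# Minimal sketch: list MySQL accounts with no password set or that accept logins
# from any host ('%') - the kind of weak configuration this campaign exploits.
# Connection details are placeholders; run with an administrative account.
# (On MySQL versions before 5.7 the column is named 'password', not
# 'authentication_string'.)
import mysql.connector

conn = mysql.connector.connect(
    host="127.0.0.1", user="root", password="<admin password>"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT user, host FROM mysql.user "
    "WHERE authentication_string = '' OR host = '%'"
)
for user, host in cursor.fetchall():
    print(f"review account: {user}@{host}")
cursor.close()
conn.close()
```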

Endpoint security against cybercrime – 8 key questions

December 17th, 2020

8 Vital Questions to Ask

Endpoint security has never been more important, more complex – or more challenging – than it is today. Given the multitude of solutions and vendors, it is very difficult to sort through all the competing claims to find what's truly effective.

1. Will this solution run on all the devices in my environment?
2. How long will deployment take?
3. What will the members of my team need to know or learn in order to work with this platform
4. What types of preventative controls are in place?
5. From where does the vendor get its threat intelligence?
6. How does this solution integrate with incident response workflows?
7. Is 24×7 professional support available from the vendor?
8. Can this solution be integrated with other security services, products, or platforms from the same vendor to reduce costs and complexity?

Why Comodo? – Zero Percent Infection and Breaches for Customers

Comodo offers the only cybersecurity that stops undetectable threats.
Cloud-native cybersecurity with auto-containment stops zero-day threats that AI, ML, and other technologies miss.


Historical scores and statistics from millions of endpoints on thousands of different enterprise customer networks show zero percent infection and breaches.

With Comodo you can “Protect without Detection.” The cloud-native framework delivers zero-day protection against undetectable threats while defending your endpoints from known threat signatures. Automatic signature updates simplify deployment across your entire environment and lower operational costs.

Contact us about Advanced Endpoint Protection 0097143365589