DataMart/BI for Monthly Reporting

What do folks here use for monthly snapshots of production data?

Our supply chain folks are looking to implement a few KPIs based on month-over-month comparisons. Their current process is to export a dashboard to Excel on the first of the month, copy and paste it into another spreadsheet, and then add the month/year. When it was just one spreadsheet, this was not a big deal. Now there are 5-6 spreadsheets and more people working in them. They have found a few issues with this: folks forget to do the export right at the beginning of the month, data gets changed after the fact, and the size of the worksheets is getting out of control.

IT to the rescue, right… Here I am looking for a simple solution. My current thought is to run a stored procedure on the 1st of the month that copies the data from the dashboard's source into a table in the database, moving the data of record from Excel into the database and adding a bit more control over it.
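For what it's worth, a minimal T-SQL sketch of that idea might look like the following. This assumes a SQL Server database, and every table and column name here (KpiSnapshot, PartBin, OnHandQty, etc.) is made up for illustration — you'd substitute whatever your dashboard's BAQ actually reads:

```sql
-- Hypothetical snapshot table; all names are illustrative only
CREATE TABLE dbo.KpiSnapshot (
    SnapshotMonth date          NOT NULL,  -- first day of the month covered
    PartNum       nvarchar(50)  NOT NULL,
    OnHandQty     decimal(18,4) NOT NULL,
    CapturedAt    datetime2     NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- Stored procedure to be scheduled (e.g. via SQL Agent) on the 1st of each month
CREATE PROCEDURE dbo.CaptureMonthlyKpiSnapshot
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.KpiSnapshot (SnapshotMonth, PartNum, OnHandQty)
    SELECT DATEFROMPARTS(YEAR(GETDATE()), MONTH(GETDATE()), 1),
           PartNum,
           SUM(OnHandQty)        -- whatever aggregation the dashboard does
    FROM   dbo.PartBin           -- stand-in for the dashboard's source query
    GROUP  BY PartNum;
END
```

Because each run stamps the rows with the month, comparing month over month is then just a matter of filtering or pivoting on SnapshotMonth.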

Is this a common way to solve this issue? I know there are a few newer Epicor solutions, but I think they are overkill for what we are trying to do.

Any guidance would be appreciated.

Thanks in Advance

I thought with SSRS you could save snapshots of a report, but I am not sure whether the data gets saved as well. Does anyone else know?

For things like this I use SSRS. I create a subscription that emails the report to whoever needs it, either an individual or a distribution group. That way the data stays static, but the report can always be viewed live from Report Manager. I have yet to use snapshots, but I believe a snapshot just saves a copy of the report in report history to be viewed/downloaded later.

Edit: I just tested with the "store all report snapshots in history" option, and it put the snapshot right into report history.

If you want to keep the process within Epicor… You can always run a scheduled BAQ export. It can just dump the data to a folder on a schedule.


Hey Ken,

Since it’s a dashboard, you have a BAQ. If you have REST enabled, you can modify this PowerShell script to run on a schedule.

# Build a Basic Authentication header value from a username/password pair
function Get-BasicAuthCreds {
    param([string]$Username,[string]$Password)
    $AuthString = "{0}:{1}" -f $Username,$Password
    $AuthBytes  = [System.Text.Encoding]::ASCII.GetBytes($AuthString)
    return [Convert]::ToBase64String($AuthBytes)
}

$BasicCreds = Get-BasicAuthCreds -Username "username" -Password "password"

# Call the Epicor REST BAQ service; the BAQ name and any parameters go in the URI
$EpicorURI = "https://yourserver/ERP10/api/v1/BaqSvc/YourBAQ?parameterName=parmvalue"
$response  = Invoke-RestMethod -Uri $EpicorURI -Headers @{"Authorization"="Basic $BasicCreds"}

# The BAQ rows come back in the .value property; write them out as CSV
$response.value | Export-Csv -NoTypeInformation -Path "c:\filelocation\data.csv"

In production, you really don’t want passwords hard-coded, but you could use Azure Key Vault or a local solution to encrypt/decrypt the password.

Mark W.


If we use SSRS, how would you set it up to compare month over month?

If we are running on a schedule anyway, wouldn’t it be better to just add the data to a table? I guess the file-drop idea (the REST call or BAQ export) would be a first step… I was thinking of keeping the data in one table long term, as opposed to one file per month.

PowerShell can write to SQL Tables too. Or you can load the CSV into SQL.
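As a rough sketch of the CSV-into-SQL route: this assumes the SqlServer PowerShell module is installed (Install-Module SqlServer) and that a snapshot table like the one discussed above already exists. The server, database, table, and column names are all made up for illustration:

```powershell
# Load the CSV produced by the BAQ export/REST call
$rows = Import-Csv -Path "c:\filelocation\data.csv"

foreach ($row in $rows) {
    # Stamp each row with the first of the current month, then insert it.
    # Column names here (PartNum, OnHandQty) are assumptions about the BAQ output.
    $sql = "INSERT INTO dbo.KpiSnapshot (SnapshotMonth, PartNum, OnHandQty)
            VALUES ('$(Get-Date -Format yyyy-MM)-01', '$($row.PartNum)', $($row.OnHandQty));"
    Invoke-Sqlcmd -ServerInstance "yourserver" -Database "KpiWarehouse" -Query $sql
}
```

For anything beyond a quick sketch you'd want parameterized inserts, or SqlBulkCopy for large row counts, rather than string-built SQL.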


If they have Office 365, they can get PowerBI Premium and use Dataflows.

Dataflows are an ETL process that lands data in Azure Data Lake and then lets you use PowerBI to analyze and report on it. Since they are web objects, you can embed the PowerBI dashboards directly into the new Active Homepages as widgets. You can also pull in data from other services, such as Salesforce or Dynamics CRM, or any data you put into the Data Lake yourself, such as log files or sensor readings.


I haven’t tried out Dataflows yet, but they sound interesting. Do you have to pay for the Data Lake? Have you embedded any PowerBI objects in your home pages? @John_Mitchell

@utaylor It’s included in your PowerBI Pro or Premium license.

Yes, it’s just a matter of embedding a webpage in the Active Homepage. This can also be done with Tableau, Qlik, Periscope or even Epicor EDA.


I have a separate DB on my SQL server as a small data warehouse. I use SSIS (included free as part of SQL Server) to run a script each night at 23:59 to copy over all PartBin tables to the Data Warehouse.

I then have an external query set up in Epicor, and a simple dashboard. Users can go back quickly and easily to a daily stock position for about 2 years currently, and that history is growing by the day.

As well as stock, we do the same for utilisation, in quantity of pallets, of our chilled warehouse.
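The nightly copy described above can be as simple as a dated insert from the ERP database into the warehouse database. A minimal sketch, assuming both databases live on the same SQL Server instance — the warehouse database, history table, and column list are invented for illustration:

```sql
-- Nightly snapshot of PartBin into the warehouse DB; names are illustrative
INSERT INTO WarehouseDB.dbo.PartBinHistory
       (SnapshotDate, Company, PartNum, BinNum, OnHandQty)
SELECT CAST(GETDATE() AS date),   -- stamp every row with the capture date
       Company, PartNum, BinNum, OnhandQty
FROM   ERPDB.Erp.PartBin;
```

The Epicor external query then just filters on SnapshotDate to show the stock position for any given day.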