Summary

WYDOT informed the ODE team that the Driver Alert location data was coming out as zeros. The ODE team investigated the data files provided by the WYDOT team and confirmed the inconsistency. Further investigation into the code base revealed that the bug was introduced when changes enhanced the metadata.serialId field to allow more effective auditing and tracking of data in order to detect missing and duplicate records. The defect was introduced in the logic that calculates the metadata and NOT the payload; there was no impact to the data payload at any time.

...

An update to the LogFileToAsn1CodecPublisher class was made to support easier troubleshooting of missing or duplicate records and was included in (PR #263, commit). This class takes the raw payload data and wraps it in metadata to create a distribution-ready message. In this class, the metadata object is created once and then updated for each message that passes through. Reusing the metadata object instead of recreating it is done for performance reasons: it reduces memory usage and improves message processing time. The update changed the logic of how this metadata was populated, and a statement for updating the metadata was missed. As a result, the metadata was not updated between messages and was instead reused with the same information, causing duplicate location and timestamp fields to be populated in all metadata generated for records in the same data file.
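The defect pattern described above can be illustrated with a minimal sketch. This is Python, not the actual Java class, and the field names (`location`, `generatedAt`, `payload`) are simplified stand-ins for the real metadata schema:

```python
# Simplified illustration of reusing one metadata object across messages.
# The "buggy" version populates the shared metadata only once, so every
# record in a file inherits the same location and timestamp; the "fixed"
# version refreshes the reused object for each record.

def publish_buggy(records):
    """Reuses one metadata dict but never refreshes it per record (the bug)."""
    metadata = {}
    out = []
    for i, rec in enumerate(records):
        if i == 0:  # update statement effectively missing for later records
            metadata.update(location=rec["location"], generatedAt=rec["generatedAt"])
        out.append({"metadata": dict(metadata), "payload": rec["payload"]})
    return out

def publish_fixed(records):
    """Refreshes the reused metadata dict for every record (the fix)."""
    metadata = {}
    out = []
    for rec in records:
        metadata.update(location=rec["location"], generatedAt=rec["generatedAt"])
        out.append({"metadata": dict(metadata), "payload": rec["payload"]})
    return out
```

Note that in both versions the payload passes through untouched, which matches the observation that only the metadata was ever affected.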

...

  • Wyoming: Wyoming has removed all of the ODE-generated data from their database for the 12/3/2018 to 2/12/2019 period and has re-uploaded all of the original log files back through ODE and into the data store. However, due to the bsmSource issue, all BSM data needs to be removed and the log files uploaded again.
  • DataHub: All affected Wyoming data with metadata.odeReceivedAt >= 12/3/2018 until dateOfBugFix AND metadata.recordGeneratedBy != TMC should be removed from the S3 bucket. Also, a notification message will be posted on DTG and Sandbox to alert users of the inconsistencies and the time frame of the impending correction.
  • SDC: All affected Wyoming data with metadata.odeReceivedAt >= 12/3/2018 until dateOfBugFix AND metadata.recordGeneratedBy != TMC should be removed from the data lake / raw submissions bucket and the Data Warehouse. Also, a notification message will be posted on the SDC site to alert users of the inconsistencies and the time frame of the impending correction.
  • End Users: A notification message will be posted to inform the users of the inconsistencies and the time frame of the impending correction.
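The removal criterion shared by the DataHub and SDC bullets above can be sketched as a simple predicate. This is an illustrative sketch, not the actual cleanup code; `dateOfBugFix` is a placeholder for the actual fix deployment time, and the record shape is assumed:

```python
from datetime import datetime, timezone

# Affected window opens 12/3/2018; a record is removable if it was received
# inside the window AND was not generated by the TMC.
BUG_START = datetime(2018, 12, 3, tzinfo=timezone.utc)

def should_remove(record, date_of_bug_fix):
    """True if a record falls inside the affected window and was not TMC-generated."""
    meta = record["metadata"]
    # metadata.odeReceivedAt is an ISO-8601 UTC timestamp ending in "Z"
    received = datetime.fromisoformat(meta["odeReceivedAt"].replace("Z", "+00:00"))
    return (BUG_START <= received <= date_of_bug_fix
            and meta.get("recordGeneratedBy") != "TMC")
```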

...

For SDC, the ODE team will work with the SDC team to remove invalid data using mutually agreed methods to identify invalid data and generate a report of affected files. Again, analysis of this report, along with any additional checks required by the data store maintainer, will determine the scope of the invalid data and the resulting deletions from the data lake / raw submissions bucket and the Data Warehouse.

The following metrics will aid in assuring the integrity of the data deposited to SDC and DataHub during the period the bug was active.

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD (should be less than Before)
DataHub
  •  Result Before Cleanup: 3,601,067
  •  Result After Cleanup: 95,065
SDC
  •  Result Before Cleanup: BSM: 6,458,749; TIM: 416,763
  •  Result After Cleanup: TBD (should be less than Before)

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD (should be the same as Before)
DataHub
  •  Result Before Cleanup: 95,065
  •  Result After Cleanup: 95,065
SDC
  •  Result Before Cleanup: 74,843
  •  Result After Cleanup: TBD (should be the same as Before)

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: should be ZERO
DataHub
  •  Result Before Cleanup: 3,506,002
  •  Result After Cleanup: 0 (should be ZERO)
SDC
  •  Result Before Cleanup: 6,458,749
  •  Result After Cleanup: should be ZERO

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: should be ZERO
DataHub
  •  Result Before Cleanup: 3,506,002
  •  Result After Cleanup: 0 (should be ZERO)
SDC
  •  Result Before Cleanup: BSM: 6,458,749; TIM: 115,463
  •  Result After Cleanup: should be ZERO

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD
DataHub
  •  Result Before Cleanup: 1970-01-01T00:00:00Z
  •  Result After Cleanup: 2019-04-11T15:40:30.395Z
SDC
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD
DataHub
  •  Result Before Cleanup: 2019-04-11T15:45:58.766Z
  •  Result After Cleanup: 2019-04-11 15:40:30.395000
SDC
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: TBD

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: Should be zero
DataHub
  •  Result Before Cleanup: 3,506,002
  •  Result After Cleanup: 0 (Should be zero)
SDC
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: Should be zero

...

WYDOT
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: Should be zero
DataHub
  •  Result Before Cleanup: 1,488
  •  Result After Cleanup: 0 (Should be zero)
SDC
  •  Result Before Cleanup: TBD
  •  Result After Cleanup: Should be zero

...

WYDOT
  •  Result Before Cleanup: TBD (should be greater than or equal to invalidRecordCount or removableRecordCount)
  •  Result After Cleanup: TBD (should be the same as Before Cleanup)
DataHub
  •  Result Before Cleanup: 2,012,382 (should be greater than or equal to invalidRecordCount or removableRecordCount)
  •  Result After Cleanup: 0
SDC
  •  Result Before Cleanup: TBD (should be greater than or equal to invalidRecordCount or removableRecordCount)
  •  Result After Cleanup: TBD (should be the same as Before Cleanup)

...

WYDOT
  •  Result Before Cleanup: TBD (should be greater than or equal to invalidS3FileCount)
  •  Result After Cleanup: TBD (should be similar to Before Cleanup)
DataHub
  •  Result Before Cleanup: 920 (should be greater than or equal to invalidS3FileCount)
  •  Result After Cleanup: 0 (should be similar to Before Cleanup)
SDC
  •  Result Before Cleanup: TBD (should be greater than or equal to invalidS3FileCount)
  •  Result After Cleanup: TBD (should be similar to Before Cleanup)

...


Task | Description | Owner | Target Completion Date | Actual Completion Date
1
  •  QA Validation Checklist and Scripts
ODE will incorporate a QA checklist and script into their QA process.

BAH

3/19/2019 (End of ODE Sprint 43) | 3/19/19
2
  •  bsmSource bug fix
ODE will fix the missing bsmSource defect and run the QA validation checklist and scripts to verify the fix and perform regression testing. This fix will be on top of the new SDW feature implementation. BAH | 3/19/2019 | 3/19/19
3
  •  DataHub Contextual Validity Checker (Canary)
ODE will deploy an AWS Lambda function to check for data inconsistencies as they arrive on DataHub. Such inconsistencies may include serialId serial numbers grossly out of order or repeating, timestamps repeating, required fields missing or null, etc. These Lambda functions will be shared with the community on GitHub so the SDC and WYDOT teams will also be able to use them for data validation.
Jira: ODE-1230
BAH | 4/3/2019 | 4/4/19
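The kinds of checks the "canary" described above might run can be sketched as follows. This is an illustrative Python sketch, not the actual Lambda code; the metadata field names are assumptions:

```python
# Flags the inconsistency classes named above: required fields missing or
# null, serial numbers out of order or repeating, and repeated timestamps.

REQUIRED = ("odeReceivedAt", "serialId", "recordGeneratedAt")

def find_inconsistencies(records):
    """Returns (index, description) pairs for each suspicious record."""
    issues = []
    prev_serial, prev_time = None, None
    for i, meta in enumerate(records):
        for field in REQUIRED:
            if meta.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        serial = meta.get("serialId")
        if prev_serial is not None and serial is not None and serial <= prev_serial:
            issues.append((i, "serial number out of order or repeating"))
        t = meta.get("recordGeneratedAt")
        if prev_time is not None and t == prev_time:
            issues.append((i, "repeated timestamp"))
        prev_serial, prev_time = serial, t
    return issues
```

An empty result means the batch looked consistent; in a real deployment the function would be triggered per arriving file and its findings forwarded as an alert.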
4
  •  ODE software release v1.0.5 
ODE will release the validated ODE software to start UAT testing by WYDOT. BAH | 3/19/2019 | 3/19/19
5
  •  WYDOT UAT
WYDOT will deploy jpo-ode-1.0.5 on their DEV server and perform UAT testing. The checklist and validation scripts will be available to them for test and validation. The SDC team will test in their test environment. WYDOT | 3/26/2019 | 3/25/19
6
  •  DataHub Sandbox folder re-structuring decision

This item is not related to the metadata bug, but for the sake of coordination this task was inserted to assess whether it would be beneficial and more efficient to perform folder restructuring at the same time as data clean-up and before data re-upload. It was ultimately decided and approved by the PO that folder restructuring adds unnecessary complexity to the clean-up process and schedule and is best deferred until after the data has been completely restored. See Jira: ODE-1229 and Jira: RDA-674 for details.

Excerpt from approval email:


Finally, based on most recent discussions with you for clarity, we have determined that the folder structure changes that have been discussed will not impact the timing of the data cleanup process we have documented and planned with WYDOT, DataHub, and SDC.  But we will again coordinate with “Scrum of Scrums” to analyze, plan and execute any folder structure work after cleanup activities are completed.  After any folder structure change implementation work is completed, communications to data store end users will then be consolidated to include BOTH cleanup close out communications and any additional required notification for folder structure changes.


BAH | 3/27/2019
7
  •  Commencement of cleanup coordination activities

All teams, ODE, Wyoming, DataHub and SDC collaborate on approach to remove invalid data from repositories. Meeting with WYDOT to establish UAT completion timeline. 

Jira: ODE-1226, RDA-667

ALL | week of 3/20/2019 | 3/25/19
8
  •  GO/NO-GO FOR WYDOT PROD DEPLOYMENT


ODE, DataHub and SDC teams will verify that freshly arrived BSM, TIM and Driver Alert messages are correct and consistent. The validation checkers deployed to both systems should also verify data validity in DEV.


COORDINATION MEETING TO CONFIRM ALL TEAMS ARE RECEIVING CLEAN DATA FROM WYDOT: GO/NO-GO FOR WYDOT PROD DEPLOYMENT


  •  Verify DataHub using the Lambda "Canary" function running in BOTH Test and Prod (reference task = RDA-668)
  •  Verify SDC using ODE Validator in BOTH Test and Prod (reference task = DOTAS-720)
  •  WYDOT
  •  DataHub
  •  SDC

4/5/2019

4/11/2019

4/11/2019
9
  •  WYDOT PROD deployment
UAT testing completed; Wyoming will promote v1.0.7 to the PROD environment AFTER the coordination meeting with stakeholders. Ref: Jira ODE-1211 → ODE-1200, ODE-1202
WYDOT

4/5/2019

4/11/2019

4/11/2019
10
  •  Query and populate 
Useful variable declarations
ODE collects counts as specified in the Useful variable declarations section. This exercise will identify the earliest and latest generatedAt times for bad data. These counts can be used in cleanup verification.
Jira: ODE-1200, ODE-1202
  •  WYDOT
  •  DataHub
  •  SDC
4/17/2019
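The count-and-bounds exercise in the task above can be sketched in a few lines. This is an illustrative Python sketch under the assumption that each bad record carries an ISO-8601 `recordGeneratedAt` string (such strings compare correctly as text):

```python
# Collects the "useful variables": the number of bad records and the
# earliest/latest generatedAt among them, for use in cleanup verification.

def summarize_bad_records(records):
    """Returns (count, earliest, latest) of recordGeneratedAt over bad records."""
    times = [r["recordGeneratedAt"] for r in records]
    if not times:
        return (0, None, None)
    return (len(times), min(times), max(times))
```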
11
  •  Identify the quantity of invalid
records and the quantity of S3 files containing invalid records

Run the queries mentioned in the Query the Count of Invalid Data section above.

Note: The BSM and TIM data types may require separate cleanup processes, as affected BSMs can likely be identified and removed without additional analysis. TIM messages will likely need additional effort to isolate affected TIMs due to the inability to re-upload unaffected Broadcast TIMs. Since the DataHub repository only contains Broadcast TIMs, which were not affected by the bug, no action is required for DataHub TIM cleanup. Only invalid BSMs received from 12/3/2018 to dateOfBugFix will need to be removed. SDC would need to remove all received TIMs for the received period of 12/3/2018 - 3/12/2019. WYDOT has already removed all invalid TIMs and the original invalid BSMs. Only BSMs received from 2/13/2019 to dateOfBugFix must be removed and uploaded again. WYDOT, SDC and DataHub will have all BSM records refreshed after the bug fix is deployed to the WYDOT PROD server. WYDOT should review whether their deduplication software will remove or retain the erroneous data.

Jira: ODE-1200, ODE-1202

  •  WYDOT
  •  DataHub
  •  SDC
4/19/2019
12
  •  Identify the names of the S3 files containing invalid records 
Run the query mentioned in the Query a List of Invalid Data Files on S3 section above.  
Jira: ODE-1200, ODE-1202
  •  WYDOT
  •  DataHub
  •  SDC
4/19/2019
13
  •  Identify the log file (original data file) names containing invalid records 
Run the query mentioned in the Query a List of source data files resulting in invalid data section above.  
Jira: ODE-1200, ODE-1202
  •  WYDOT
  •  DataHub
  •  SDC
4/19/2019
14
  •  Create report summarizing findings of invalid record generatedAt times, count, and S3 and log file names
The results of the queries from tasks 9-12 should be aggregated into a summary report to be presented to the product owner.
Jira: ODE-1200, ODE-1202
  •  WYDOT
  •  DataHub
  •  SDC
4/19/2019
15
  •  GO/NO-GO ON DATA REMOVAL


COORDINATION MEETING TO CONFIRM ALL TEAMS' DATA REMOVAL PLANS WITH ALL STAKEHOLDERS (ODE-1212): GO/NO-GO ON DATA REMOVAL


  •  Verify DataHub (reference task = RDA-669)
  •  Verify SDC (reference task: DOTAS-721)
  •  WYDOT
  •  DataHub (GO)
  •  SDC

4/26/2019

Recovered Schedule and conducted meeting on 4/19/2019

4/19/2019
16
  •  Perform cleanup

Once confidence in the summary findings is gained, the Lambda function used to run the queries in tasks 9-12 should be modified to delete the S3 files found by those queries. Running this function will be the cleanup step.
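The deletion step could look something like the sketch below. This is an illustrative sketch, not the actual Lambda: the function name and dry-run behavior are assumptions, and only the S3 `delete_objects` call (batched at its 1,000-key limit) is real AWS SDK API:

```python
def delete_invalid_files(bucket, keys, dry_run=True):
    """Deletes the S3 objects named by the cleanup queries; dry_run only lists them."""
    # delete_objects accepts at most 1000 keys per call, so batch accordingly
    batches = [[{"Key": k} for k in keys[i:i + 1000]]
               for i in range(0, len(keys), 1000)]
    if dry_run:
        return [obj for batch in batches for obj in batch]
    import boto3  # deferred so a dry run needs no AWS SDK or credentials
    s3 = boto3.client("s3")
    deleted = []
    for batch in batches:
        resp = s3.delete_objects(Bucket=bucket, Delete={"Objects": batch})
        deleted.extend(resp.get("Deleted", []))
    return deleted
```

Running with `dry_run=True` first mirrors the report-then-delete flow described in the surrounding tasks: the file list can be reviewed against the summary report before the destructive pass.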

Jira: ODE-1212 → ODE-1203, ODE-1204; RDA-670

  •  WYDOT
  •  DataHub
  •  SDC
4/19/2019
17
  •  Rerun queries to ensure that cleanup was successful

The following queries from the Pseudo-queries for Validating the Clean Up section above:

  • Query the Count of Valid Data
  • Query a List of Valid Data Files on S3
  • Query a List of source data files

should be run to verify that the actual output matches the expected output.
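A minimal way to mechanize the expected-versus-actual comparison is sketched below. The check names and expectations are illustrative examples, not the actual validation scripts:

```python
# Compares each query's actual result against an expectation predicate and
# returns the checks that failed, so an empty list means the cleanup verified.

def validate_cleanup(results, expectations):
    """Returns (name, actual) pairs for every check whose expectation fails."""
    failures = []
    for name, expected in expectations.items():
        actual = results.get(name)
        if not expected(actual):
            failures.append((name, actual))
    return failures
```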

Jira: ODE-1203, ODE-1204

  •  WYDOT
  •  DataHub
  •  SDC


18
  •  Validate that the cleanup performed the corrective actions
The Validation Checklist above should be repeated to ensure that the cleanup actions deleted the invalid data and did not affect the valid data.

In addition to the automated validation steps using the queries, a manual inspection of the bucket should be performed as a sanity check.
Jira: ODE-1203, ODE-1204
  •  WYDOT
  •  DataHub
  •  SDC


19
  •  GO/NO-GO ON WYDOT RE-UPLOAD

COORDINATION MEETING TO CONFIRM GO/NO-GO ON WYDOT RE-UPLOAD

Jira: ODE-1269

DataHub and SDC expect the following to be re-uploaded by WYDOT:

  1. ALL BSM (bsmTx, bsmLogDuringEvent, rxMsg), TIM (rxMsg, dnMsg), and Driver Alert (driverAlert) files received from 12/3/2018 - 4/11/2019 inclusive need to be re-uploaded.


  •  Verify DataHub (reference task = RDA-751)
  •  Verify SDC (reference task: DOTAS-722)

4/26/2019
20
  •  WYDOT Starts Re-upload of historical data

The WYDOT team starts re-uploading all data files that were identified during the analysis phase as invalid and deposits them to the respective data stores.


  •  WYDOT sends an e-mail communication when re-upload begins and communicates if there is any risk to meeting Step 20 below by 5/3/2019

Due to inconsistencies between the data stored in the WYDOT database and SDC, an investigation was initiated with the following results:

  • It was concluded that the Lear device is putting duplicate records in the data files that are being uploaded to ODE.
  • Lear will fix the issue in the next release, but we are not sure when that release will arrive and be deployed. Therefore, we must make do with the current release and post-process the data to remove duplicates.
  • Both WYDOT and SDC have de-duplication in place before data is deposited to the Oracle database and the SDC Data Warehouse. However, the de-duplication is potentially flawed because it does not take the source of the BSM ("metadata.bsmSource") into account. As is, the deduplication could potentially remove BSMs that were uploaded by different vehicles (EV or RV).
  • We do not know, and cannot easily determine, whether all the data has been deposited to SDC and WYDOT, due to the excessive load during the initial re-upload, the Lear duplicate record issue, and the deduplication deficiencies.
  • Currently the deduplication is based on the following field values:
    • MSGCNT
    • SECMARK
    • POSITION_LAT
    • POSITION_LONG
    • POSITION_ELEV
    • RECORD_GENERATED_AT
    • LOG_FILE_NAME
  • BSM_SOURCE must be added to the list to perform a more correct de-duplication of BSM data
  • Brandon will be out of the office 5/20-5/22 and will be available to start re-upload on 5/23

WYDOT

4/30/2019 | 4/30/2019
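The corrected de-duplication key described above can be sketched as follows. This is an illustrative Python sketch, not WYDOT's or SDC's actual pipeline; the field names come from the list above, with BSM_SOURCE as the newly added field:

```python
# De-duplication keyed on the original fields plus BSM_SOURCE, so BSMs
# uploaded by different vehicles (EV vs RV) are no longer collapsed.

KEY_FIELDS = ("MSGCNT", "SECMARK", "POSITION_LAT", "POSITION_LONG",
              "POSITION_ELEV", "RECORD_GENERATED_AT", "LOG_FILE_NAME",
              "BSM_SOURCE")  # BSM_SOURCE is the newly added field

def dedup_key(row):
    """Builds the tuple used to detect duplicate BSM rows."""
    return tuple(row.get(f) for f in KEY_FIELDS)

def deduplicate(rows):
    """Keeps the first occurrence of each key, preserving input order."""
    seen, kept = set(), []
    for row in rows:
        k = dedup_key(row)
        if k not in seen:
            seen.add(k)
            kept.append(row)
    return kept
```

Without BSM_SOURCE in `KEY_FIELDS`, two records identical in every other field but from different sources would wrongly collapse to one; including it is exactly the correction tasked below.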
21
  •  De-dup correction
  •  Brandon has added the additional BSM_SOURCE field in the de-duplication algorithm which is under test on DEV. Brandon will deploy the new de-dup to PROD by 5/15/2019 00:00:00 UTC. If unable to complete these tasks by 5/15/2019 00:00:00 UTC, we could simply record the UTC time when the new de-dup process is in place and will use that time stamp for the cleanup (WYDOT de-dup timestamp).
  •  SDC will implement the fix to de-dup process. SDC will record when the de-dup process is deployed as the “SDC de-dup timestamp” for removal of data.
  •  DataHub will propose to Ariel to implement de-dup process as part of the Folder Restructuring effort. (Michael Middleton (Unlicensed)Lien, Julia [USA] (Unlicensed))
  •  WYDOT de-dup timestamp = 2019-5-15T00:00:00.000Z
  •  SDC de-dup timestamp = ???
5/14/2019 | 5/14/2019
22
  •  Partial/Duplicate Data Removal
  •  Brandon will remove all BSM, Received TIM and Driver Alert data from 12/3/2018 00:00:00 UTC through 5/15/2019 00:00:00 UTC (or whatever time the new de-dup is deployed) from the Oracle database. TIM Broadcast data shall not be removed. (@Brandon Payne)
  •  SDC will remove all BSM, Received TIM and Driver Alert data from 12/3/2018 00:00:00 UTC through the “WYDOT de-dup timestamp” from the Data Warehouse and correct any missing data in the data warehouse after the fact without impacting other teams. (@Chupp, William (Volpe), @Mayorskiy, Vyacheslav CTR (Volpe)). 
  •  DataHub will also remove all data from 12/3/2018 00:00:00 UTC through 5/15/2019 00:00:00 UTC from the Sandbox. (Matthew Schwartz (Unlicensed), Lien, Julia [USA] (Unlicensed))
  •  De-dup correction and data removal must be completed by close of 5/22/2019 so that when Brandon returns and starts re-uploading data, all repositories are ready.
  •  WYDOT 5/14/2019
  •  SDC
  •  DataHub


5/22/2019
23
  •  Start Re-upload Round 2

On 5/23/2019, Brandon will re-upload all data from 12/3/2018 00:00:00 UTC through 5/15/2019 00:00:00 UTC.

WYDOT | 5/23/2019
24
  •  DataHub Release Note

DataHub will post a Release Note indicating that there are and will be duplicate records in the data until further notice; when that changes depends on Lear's fix to the firmware. (Lien, Julia [USA] (Unlicensed), Michael Middleton (Unlicensed))

BAH | 5/23/2019
25
  •  Update this page

Hamid Musavi (Unlicensed) will update the metadata bug Confluence page with the current status, understanding, and action plan. Meanwhile, copying @Ariel.Gold@dot.gov so she is informed of where we are in the re-upload process.

BAH | 5/17/2019
26
  •  VERIFICATION OF SUCCESSFUL RESTORATION OF DATA

COORDINATION CLOSE OUT E-MAIL REPORTS TO CONFIRM VERIFICATION OF CLEAN UP COMPLETED


  •  After WYDOT upload, confirm that the data in the WYDOT BSM data set on DTG is all clean (Jira: RDA-760)


  •  WYDOT
  •  DataHub
  •  SDC
6/17/2019

5/3/2019: held meeting, but "NO-GO". Data re-upload issue resolution is ongoing with daily meet-ups:

  • As of 5/14/19, a data duplication bug was uncovered in the Lear source data sent to WYDOT
27
  •  Communicate to data users

Communicate to all data users of WYDOT and SDC, and document on DataHub, that the cleanup is complete

  • Verify DataHub (reference task = RDA-671)
BAH | 5/31/2019

...

This is an update regarding the errors in the Wyoming metadata that were communicated to the user community on 3/8/2019. The bug manifested itself as incorrect metadata fields in the Wyoming Basic Safety Message (BSM) and Traveler Information Message (TIM) data. All payload data included in BSMs and TIMs is unaffected and entirely correct.

...

As noted previously, the bug manifested itself as incorrect metadata fields in the Wyoming Basic Safety Message (BSM) and Traveler Information Message (TIM) data. All payload data included in BSMs and TIMs is unaffected and entirely correct. The ODE software version that contributed to the metadata field errors has been corrected and deployed to the Wyoming production server on 4/11/2019. No more invalid data has been deposited to DataHub as of 4/12/2019, and all existing invalid data has been removed as of 4/26/2019.

...

We appreciate your patience as we work through resolving these metadata errors. If you have any questions about these issues or have additional issues to report, please contact RDAE_Support@bah.com.


DataHub

Below is a proposed communication tailored to be posted on the ITS Sandbox http://usdot-its-cvpilot-public-data.s3.amazonaws.com/index.html. It contains fewer contextual details as it will appear alongside the past notifications. Please review the language and reply to all with your feedback.

...