Access Duplicate Records

MHamid

Hello,

I need some suggestions.

I have a Form that will add records to a table.
These are the fields that will need to be filled using the form:
Entered By
Date Entered
Cycle Month
Report Type
Date Reviewed
Reviewer Type
Reviewer
Reviewer Report Area
Main Section
Topic Section
Ownership-Individual
Ownership-Team
Count
Priority
Approved By
L1 (Quality Category L1)
L2 (Quality Category L2)
L3 (Quality Category L3)
Other (If Other is selected in L3)
Repeat Ask (check box)
Exception (check box)
Notes
Outliers
Additional Outliers

I have a RefID column that is an auto-generated number.
I need to ensure that when a record is added, it is not a duplicate. However, among the columns mentioned above there is no customer name or account number that would make it easy to check for duplicates.
Looking at each column on its own, its values can repeat, but the entire row cannot be the same.
How can I ensure no duplicates are added to the table by checking whether all of the columns are the same?


Thank you
 
I have 21 fields. Will each suggestion accommodate 21 values?

Also, how can I go about doing what you're suggesting in SQL?
 
Sorry, is this what you are referring to? A query, but going into SQL View to type the code below.
If so, how would I incorporate this query into the database?

Code:
select * from QAMaster
where EnteredBy = Me.EnteredBy
and DateEntered = Me.DateEntered
and CycleMonth = Me.CycleMonth
and ReportType = Me.ReportType
and DateReviewed = Me.DateReviewed
and ReviewerType = Me.ReviewerType
and Reviewer = Me.Reviewer
and ReviewerReportArea = Me.ReviewerReportArea
and MainSection = Me.MainSection
and TopicSection = Me.TopicSection
and OwnershipIndividual = Me.Individual
and OwnershipTeam = Me.Team
and Count = Me.txtCount
and Priority = Me.PriorityLevel
and ApprovedBy = Me.Approved
and L1 = Me.L1
and L2 = Me.L2
and L3 = Me.L3
and RepeatAsk = Me.Repeat
and Exception = Me.Exception
and Hyperlink = Me.txtHyperlink
 
That's a very large number of fields to check. Are you sure you will have duplicates on so many fields? And if so, why not just add the duplicate, since you can always filter duplicates out later.


Note: As for the specific answer to your question, you'd have to turn that into runnable SQL text (which means adding quotes around text values and hash marks around dates, etc.):

Code:
strSQL = "select * from Table1 where Field1 = " & me.Ctrl1 & " and Field2 = "'" & me.Ctrl2 & "'" & " and Field3 = #" & Field3 & me.ctrl3 "#"

That is very error-prone at best, so you have to be really careful; with so many fields I think you will have problems.
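
For what it's worth, here is a minimal sketch of that pattern, using your QAMaster table and a few of the control names from your draft query (the remaining fields follow the same delimiter rules); DCount can then test the criteria before the record is saved:

Code:
Dim strWhere As String

' Text gets single quotes, dates get # delimiters, numbers get neither.
strWhere = "EnteredBy = '" & Me.EnteredBy & "'" & _
           " AND DateEntered = #" & Format(Me.DateEntered, "mm\/dd\/yyyy") & "#" & _
           " AND [Count] = " & Me.txtCount    ' Count is a reserved word, so bracket it

' ...repeat for the other fields, then check for an existing match.
If DCount("*", "QAMaster", strWhere) > 0 Then
    MsgBox "A matching record already exists.", vbExclamation, "Duplicate Entry"
End If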
 
That's the problem I'm facing: each field on its own can be duplicated, but the full record should not be duplicated.
It's being requested that we trap the duplicate so that it will not be added to the table. I did read about a query that can remove duplicates, but I don't think my requestor will like that, because we want to run a report with valid data, and if there are duplicates that will give us an inaccurate count. That's why I have to figure out a way to trap it, even if it means writing complicated code.

I will try writing the SQL as runnable SQL text. Once that's done, would I need to call it in one of the form's events?
 
To the last question, yes.

Also a bit easier would be to index the table on the fields that cannot be duplicated. You just create a unique index on those fields. Then you only have to trap the error if the insertion fails (which you can test by attempting to insert a duplicate).

Note that if dateentered is the date and time (rather than just the date) then duplicates are very unlikely.

Also, it might be better to think in terms of keys. What are the primary keys of this table? They should not be duplicated, but the rest shouldn't matter, since by definition the other fields depend on the primary key - assuming the primary key is properly defined anyway (a big if, since you probably won't know whether that's true). If all of these fields are your primary key, you should create the primary key, which also solves the problem, since primary keys are by definition unique indexes.
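
To illustrate the unique-index route (a sketch only - QAMaster is the table name from your earlier post, and the field list is just an example; note that an Access index can contain at most 10 fields, so pick the subset that really defines a duplicate):

Code:
' One-time setup: create a multi-field unique index on the columns that
' must not repeat together (field list here is only an example).
Public Sub CreateDuplicateIndex()
    CurrentDb.Execute _
        "CREATE UNIQUE INDEX uxNoDupes ON QAMaster " & _
        "(EnteredBy, DateEntered, CycleMonth, ReportType, Reviewer, MainSection, TopicSection);"
End Sub

' In the data-entry form's module: trap the engine error Access raises
' when the unique index rejects a duplicate row.
Private Sub Form_Error(DataErr As Integer, Response As Integer)
    If DataErr = 3022 Then            ' 3022 = duplicate values error
        MsgBox "This record already exists.", vbExclamation, "Duplicate Entry"
        Me.Undo                       ' discard the pending duplicate
        Response = acDataErrContinue  ' suppress the default Access error message
    End If
End Sub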
 
Below is a sample of the data to be recorded. The columns in red will need to be updated for every record. The columns not in red can be duplicated ... ignore RefID. DateEntered column is just short date with no time stamp.

If I make the fields in red primary keys, doesn't that mean those fields can never repeat within their own columns at all?
Columns (in order): RefID | EnteredBy | DateEntered | CycleMonth | ReportType | DateReviewed | ReviewerType | Reviewer | ReviewerReportArea | MainSection | TopicSection | OwnershipIndividual | OwnershipTeam | Count | Priority | ApprovedBy | L1 | L2 | L3 | Other | RepeatAsk | Exception | Notes | Outliers | AdditionalOutlier | Hyperlink

[Sample rows: eight QA findings (RefID 113, 115, 122, 125, 127, 128, 130 and 133), all entered by mh15601 between 1/9/2018 and 1/11/2018, covering the CBNA MOR, CIOC and OTRC MOR report types; many of the values outside the red columns (Reviewer, MainSection, TopicSection, Ownership and the Outliers/Hyperlink file paths) repeat across rows.]
 
If you create a primary key then that key (i.e., whatever fields are in the primary key, whether one or more than one) will always be unique and cannot be duplicated.

That said, notes should never be part of a primary key - especially since notes could often be blank or null. Primary keys can never be left null.

You can also create unique indexes, which are not technically primary keys but would work similarly. I think that with unique indexes you can allow values to be null, however, which would be a crucial difference.

The only thing that is strange about this is why you would have so many fields that would be potentially duplicated. On a practical level, it would really take a lot of work on the part of a user to duplicate all that information - everything goes back to your data though.
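
For example (again a sketch, using the QAMaster table name and a placeholder field list), Access SQL accepts an IGNORE NULL option, which leaves rows containing Null in the indexed fields out of the index so they never trip the duplicate check:

Code:
Public Sub CreateIgnoreNullIndex()
    ' Unique index that skips rows with Null in these fields,
    ' so a Null never blocks or triggers the duplicate check.
    CurrentDb.Execute _
        "CREATE UNIQUE INDEX uxQAFinding ON QAMaster " & _
        "(DateEntered, Reviewer, MainSection, TopicSection, L1, L2, L3) " & _
        "WITH IGNORE NULL;"
End Sub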
 
This is just the type of data that is to be recorded for each QA Finding.
I'm pretty certain that a primary key is not going to work for me in this case.

I'll check out unique indexes and see if that will work.
 
Hello,

So I decided to concatenate my fields. I created a concatenation column in my table and added that field to the form. The txtConcatenation control is bound.
I am using the code below in the txtHyperlink field, which is the last field the user is required to update. So far I have had no issues using this method to prevent duplicates.

Code:
Dim DupRecord As String
Dim stLinkCriteria As String

' Build the lookup criteria from the bound concatenation field.
DupRecord = Me.txtConcatenation.Value
stLinkCriteria = "Concatenation='" & DupRecord & "'"

' If an identical concatenation already exists in QAMaster, clear the form and warn the user.
If Me.txtConcatenation = DLookup("[Concatenation]", "QAMaster", stLinkCriteria) Then
    Call ClearForm_Click
    MsgBox "Record has already been added." _
        & vbCrLf & "Please verify data and try again.", vbInformation, "Duplicate Entry!"
End If

Does anyone see any issues using this method to prevent duplicates?
 
The principle is sound. So it should work, as long as you have done everything properly. Minimally, you should test your code by attempting to enter a known duplicate record.
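
One thing worth testing specifically (a sketch only, reusing the control and table names from your post): if the concatenated text ever contains an apostrophe, the quoted criteria string will break. Doubling the apostrophes, and using DCount (which returns 0 rather than Null when there is no match), makes the check a bit more robust:

Code:
Dim stLinkCriteria As String

' Double any apostrophes so they do not break the quoted criteria string.
stLinkCriteria = "Concatenation='" & Replace(Me.txtConcatenation.Value, "'", "''") & "'"

' DCount returns 0 when no matching row exists, so no Null handling is needed.
If DCount("*", "QAMaster", stLinkCriteria) > 0 Then
    Call ClearForm_Click
    MsgBox "Record has already been added." _
        & vbCrLf & "Please verify data and try again.", vbInformation, "Duplicate Entry!"
End If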
 