True Pedigree Errors More Frequent Than Apparent Errors for Single Nucleotide Polymorphisms

Abstract
Single nucleotide polymorphisms (SNPs) are currently being developed for use in disequilibrium analyses. An SNP is a biallelic marker, and its two alleles may differ widely in frequency, giving varying degrees of polymorphism. A natural design for use with SNPs is the ‘haplotype relative risk’ sampling design, in which a father, mother, and child are typed at an SNP locus. Given such a trio of genotypes, we ask: what is the probability that a pedigree error (a change from one allele to the other) at an SNP locus will be detected using only Mendel’s laws as a check? We calculate the probability of detecting such errors for a hypothetical SNP locus with varying degrees of polymorphism and for various true error rates. For the sets of allele frequencies considered, we find that the detection rate ranges between 25 and 30%; it is lowest when the two alleles have equal frequencies and highest when one allele has a frequency of 10%. Based on this detection rate, we determine that the true error rate is roughly 3.3–4 times the apparent error rate at an SNP locus. The greatest discrepancy between true and apparent error rates occurs when the allele frequencies are equal.
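The quantity described above can be illustrated with a small enumeration. The sketch below is not the authors' calculation; it assumes Hardy–Weinberg genotype frequencies, random mating, and a single allele error placed uniformly at random among the six alleles of the trio (i.e., the small-error-rate limit), and asks how often the resulting trio violates Mendel's laws. Under these assumptions the enumeration yields a detection probability of 0.25 at equal allele frequencies and about 0.30 when one allele has frequency 0.10, and the reciprocal of the detection rate gives the corresponding true-to-apparent error ratio, consistent with the 25–30% and 3.3–4 figures quoted in the abstract.

```python
from itertools import product

def genotype_freqs(p):
    """Hardy-Weinberg genotype frequencies when allele 'A' has frequency p."""
    q = 1.0 - p
    return {("A", "A"): p * p, ("A", "B"): 2 * p * q, ("B", "B"): q * q}

def mendel_consistent(father, mother, child):
    """True if the child could have received one allele from each parent."""
    return any(
        tuple(sorted((fa, ma))) == child for fa, ma in product(father, mother)
    )

def flip(genotype, index):
    """Error model: change one allele of a genotype to the other allele."""
    alleles = list(genotype)
    alleles[index] = "B" if alleles[index] == "A" else "A"
    return tuple(sorted(alleles))

def detection_rate(p):
    """Probability that a single random allele error in a father-mother-child
    trio produces a Mendelian inconsistency (HWE, random mating assumed)."""
    gf = genotype_freqs(p)
    total = 0.0
    for father, f_prob in gf.items():
        for mother, m_prob in gf.items():
            # Child receives one allele from each parent; each of the four
            # ordered transmissions has probability 1/4.
            for fa, ma in product(father, mother):
                child = tuple(sorted((fa, ma)))
                trio_prob = f_prob * m_prob * 0.25
                # The error hits one of the six alleles uniformly at random.
                detected = 0
                for person in range(3):
                    for idx in range(2):
                        trio = [father, mother, child]
                        trio[person] = flip(trio[person], idx)
                        if not mendel_consistent(*trio):
                            detected += 1
                total += trio_prob * detected / 6.0
    return total

if __name__ == "__main__":
    for p in (0.5, 0.4, 0.3, 0.2, 0.1):
        rate = detection_rate(p)
        # For small error rates, apparent rate ~ detection rate * true rate,
        # so the true-to-apparent ratio is roughly 1 / detection rate.
        print(f"p = {p:.1f}: detection rate = {rate:.3f}, "
              f"true/apparent ratio ~ {1 / rate:.2f}")
```

This single-error approximation is only a sketch of the idea; the published calculation varies the true error rate explicitly and may allow multiple errors per trio, so its exact figures need not coincide with the output above.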
