Epistemological holism or confirmation holism, I take it, holds that sentences cannot be justified or disconfirmed in isolation. In other words, we can only fundamentally justify or disconfirm sufficiently large conjunctions of individual claims. What counts as ‘sufficient’ depends on how big a chunk of theory you think must be in play before justification accrues: some holists allow large sets of beliefs to be confirmed/justified even when they are proper subsets of your total belief-set. (I will use ‘individual claim’ to mean a claim that is ‘too small’, i.e. ‘too isolated’, to admit of confirmation, according to the holist.)
I’m guessing confirmation holists also think that the individual claims that make up a justified whole are themselves justified. (If holists didn’t think that, then it seems like any individual claim a holist made would be unjustified, by the holist’s own lights, unless they uttered it in conjunction with a sufficiently large set of other utterances.) The individual claims are justified, presumably, by being part of a sufficiently large conjunction of claims that are fundamentally/basically justified. Individual claims, if justified, can only be derivatively justified.
Presumably, if one believes that ‘A1 & A2 & … & An’ (call this sentence AA) is justified, then that person has (or: thinks they should have) a rational degree of belief in AA over 0.5.
But now I have questions:
(1) How does a holist get from the degree of belief she has in AA, to the degree of belief she has in a particular conjunct? There are many, many ways consistent with the probability calculus to assign probabilities to each of the Ai’s so as to get any particular rational degree of belief in AA (except 1).
(2) We might try to solve the ‘underdetermination’ problem in (1) by specifying that every conjunct is assigned the same degree of belief. This seems prima facie odd to me, since presumably some conjuncts are more plausible than others, but I don’t see how the holist could justify having different levels of rational belief in each conjunct, since each conjunct gets its justification only through the whole. (Perhaps there is a story for the partial holist to tell about claims participating in multiple sufficiently large conjunctions that are each justified?)
(3) Various ways of intuitively assigning degrees of belief to the individual conjuncts seem to run into problems:
(i) The holist might say: if I have degree of belief k in AA, then I will have degree of belief k in each conjunct. Problem: that violates the axioms of the probability calculus (unless k = 1, or the conjuncts are all logically equivalent).
(ii) Alternatively, if the holist wants to obey the axioms of the probability calculus, then the rational degree of belief she will need to have in each conjunct must be VERY high. For example, if the degree of belief in AA is over 0.5, each conjunct is assigned the same value (per (2)), the conjuncts are independent, and there are 100 individual conjuncts, then one’s degree of belief in each conjunct must be over 0.5^(1/100), which is about 0.993. And that seems really high to me.
(iii) One alternative to that would be to say that each conjunct of a large conjunction just has to get a degree of belief over 0.5. But then you would have to say that the big 100-conjunct conjunction is justified when your rational degree of belief in it is anything above 0.5^100, i.e. about 7.9×10⁻³¹ (taking the conjuncts to be independent). And that doesn’t sound like a justified sentence.
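To make the arithmetic behind (1), (ii), and (iii) concrete, here is a minimal sketch in Python. The joint distributions joint_a and joint_b are made-up illustrations (any numbers with the same structure would do), and the (ii)/(iii) figures assume the conjuncts are probabilistically independent.

```python
# (1) Underdetermination: two joint distributions over a toy two-conjunct
# "theory" A1 & A2 that agree on P(A1 & A2) = 0.6 but assign different
# probabilities to the individual conjuncts. Keys are possible worlds,
# written as (A1 true?, A2 true?). The numbers are illustrative.
joint_a = {(True, True): 0.6, (True, False): 0.0,
           (False, True): 0.4, (False, False): 0.0}
joint_b = {(True, True): 0.6, (True, False): 0.2,
           (False, True): 0.15, (False, False): 0.05}

def marginals(joint):
    """Probability of each individual conjunct, given a joint distribution."""
    p_a1 = sum(p for (a1, _), p in joint.items() if a1)
    p_a2 = sum(p for (_, a2), p in joint.items() if a2)
    return p_a1, p_a2

print(marginals(joint_a))  # A1 at 0.6, A2 at 1.0
print(marginals(joint_b))  # A1 at 0.8, A2 at 0.75 (up to float rounding)

# (ii) If the conjunction of 100 equally-believed, independent conjuncts
# is to sit at 0.5, each conjunct must sit at 0.5 ** (1/100), about 0.9931.
per_conjunct = 0.5 ** (1 / 100)
print(per_conjunct)

# (iii) Conversely, 100 independent conjuncts each at exactly 0.5 drive
# the conjunction down to 0.5 ** 100, about 7.9e-31.
conjunction = 0.5 ** 100
print(conjunction)
```

Both joint distributions are probabilistically coherent, which is the sense in which the probability calculus alone underdetermines the holist’s degrees of belief in the individual conjuncts.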
Two final remarks: First, it seems like someone must have thought of this before, at least in rough outline. But my 10 minutes of googling didn’t turn up anything. So if you have pointers to the literature, please send them along. Second, for what it's worth, this occurred to me while thinking about the preface paradox: if you think justification only fundamentally accrues to large conjunctions and not individual conjuncts, then it seems like you couldn’t run (something like) the preface paradox, since you couldn't have a high rational degree of belief in (an analogue of) the claim ‘At least one of the sentences in this book is wrong.’