I am confused about one of the assumptions made when working with indifference curves: the monotonicity assumption, generally described as the 'more is better' assumption. That is, suppose $c_1$ and $c_2$ are bundles of the same $n$ commodities. Monotonicity means that if $c_1$ contains at least as much of every commodity as $c_2$, and no less of any (written $c_1 \geq c_2$), then $c_1 \succeq c_2$ (where $\succeq$ means weakly preferred). **But doesn't this assumption fail when applied to the perfect-complements indifference curve model?**
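To make sure I have the definition right, here it is in symbols (the component notation is my own, just to be explicit about what $c_1 \geq c_2$ means):

$$
c_1 = (x_1^1, \dots, x_n^1), \quad c_2 = (x_1^2, \dots, x_n^2), \qquad
x_i^1 \geq x_i^2 \ \text{for all } i \ \implies \ c_1 \succeq c_2 .
$$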
The basic concept is that X and Y are consumed in a fixed proportion. Utility only increases at points like A and B, the kinks of the L-shaped indifference curves (ICs), where the two arms meet. Anything beyond a kink, along either arm, adds zero extra utility. So at point C, utility would be the same as at A. But doesn't this contradict the 'more is better' assumption? As Shon, from the University of Chicago, says, an increase in one commodity, even when the other commodity is held fixed, should still be preferred, and this shows up as an increase in utility. Nevertheless, in this perfect-complements model, an increase in one commodity, such as moving from x1 to x3 with Y fixed, results in utility equal to that at A, so the bundle with more X is not preferred.
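To put numbers on it, here is a minimal worked example, assuming the standard Leontief (perfect-complements) utility function with a 1:1 proportion and hypothetical coordinates for A and C:

$$
U(x, y) = \min(x, y), \qquad
U(A) = U(1, 1) = 1, \qquad
U(C) = U(3, 1) = \min(3, 1) = 1 .
$$

Moving from A to C triples the amount of X but leaves utility unchanged, which is exactly the behaviour I am asking about.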
Am I just not understanding the implications of this assumption, or is it legitimate to conclude that this is a contradiction?