From: Olof Johansson
Date: Mon, 13 Jun 2005 22:52:27 +0000 (-0700)
Subject: [PATCH] Fix PCI BAR size interpretation on 64-bit arches
X-Git-Tag: v2.6.12~21
X-Git-Url: https://err.no/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=f797f9cc5485b50c35c106b462e1bc432ec37f90;p=linux-2.6

[PATCH] Fix PCI BAR size interpretation on 64-bit arches

On 64-bit machines, PCI_BASE_ADDRESS_MEM_MASK and other mask constants
passed to pci_size() are 64-bit (for example ~0x0fUL). However, pci_size()
does comparisons between the u32 arguments and the mask, which will fail
even though any result from pci_size() still fits in 32 bits.

Changing the mask argument to u32 seems the obvious thing to do, since all
arithmetic in the function is 32-bit and a wider mask makes no sense.

This triggered on a PPC64 system here where an adapter (VGA, as it
happened) had a memory region base of 0xfe000000 and a sz of the same,
matching the if (base == maxbase ...) test at the bottom of pci_size() but
failing the mask comparison. Quite a corner case, which I guess explains
why we haven't seen it until now.

Signed-off-by: Olof Johansson
Acked-by: Greg KH
Signed-off-by: Andrew Morton
Signed-off-by: Linus Torvalds
---

diff --git a/drivers/pci/probe.c b/drivers/pci/probe.c
index b7ae87823c..fd48b201eb 100644
--- a/drivers/pci/probe.c
+++ b/drivers/pci/probe.c
@@ -125,7 +125,7 @@ static inline unsigned int pci_calc_resource_flags(unsigned int flags)
 /*
  * Find the extent of a PCI decode..
  */
-static u32 pci_size(u32 base, u32 maxbase, unsigned long mask)
+static u32 pci_size(u32 base, u32 maxbase, u32 mask)
 {
 	u32 size = mask & maxbase;	/* Find the significant bits */
 	if (!size)
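
For reference, a minimal userspace sketch (not part of the patch) of the
failure mode described above, assuming an LP64 host where ~0x0fUL is 64
bits wide. The variable names mirror pci_size() for readability, and the
constants are taken from the PPC64 report in the commit message; this is
illustrative only, not the kernel code path itself:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
	uint32_t base = 0xfe000000;	/* BAR base from the PPC64 report */
	uint32_t maxbase = 0xfe000000;	/* sz read back after writing all 1s */
	unsigned long mask = ~0x0fUL;	/* 0xfffffffffffffff0 on LP64 */

	uint32_t size = mask & maxbase;		/* significant bits: 0xfe000000 */
	size = (size & ~(size - 1)) - 1;	/* lowest set bit - 1: 0x01ffffff */

	/* (base | size) is a u32 (0xffffffff); promoted to 64 bits for the
	 * AND, its upper half is zero, so the result (0xfffffff0) can never
	 * equal the 64-bit mask and the BAR is wrongly rejected. */
	printf("unsigned long mask: %s\n",
	       ((base | size) & mask) == mask ? "accepted" : "rejected");

	/* With the mask truncated to u32, as in the patch, the comparison
	 * behaves as intended and the decode is accepted. */
	uint32_t mask32 = (uint32_t)mask;	/* 0xfffffff0 */
	printf("u32 mask:           %s\n",
	       ((base | size) & mask32) == mask32 ? "accepted" : "rejected");
	return 0;
}

Compiled on a 64-bit host, this prints "rejected" for the unsigned long
mask and "accepted" for the u32 mask, which is why narrowing the parameter
type is sufficient to fix the bug.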