Tuesday, December 20, 2011

Why does Vista allow sending oversize packets on a 100Mbps LAN when using a gigabit adapter?

A fellow developer is using the WinPcap DLLs to capture packets on Vista with his app. Somehow, when using IE7, the GET request to www.yahoo.com results in a single oversize packet (over the 1500-byte MTU). Every once in a while, the oversize packet also has an IP datagram length of zero. The payload carries a large cookie but otherwise looks normal. This is a clean Vista install, no firewall on, no antivirus.

I can understand things like packet checksums being bad when TCP checksum offloading is in play. Even assuming the gigabit adapter supports large send offload and is using it (so the hardware segments the packet after WinPcap has already grabbed it), that doesn't explain the zero IP datagram length. Is this something specific to IE7, or to the new Vista TCP/IP stack?

I vaguely remember someone saying that when Vista runs a gigabit adapter hooked to a 100 Mbps LAN, it will sometimes spit out oversize packets, and nobody complains because the majority of switching hubs have gotten good enough to fragment on their own, or in the rare case actually refuse the packet and force Vista to do it right, so this is a silent error situation. The packets are usually in the 1600s in size.
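Both symptoms described above (frames larger than the MTU, and an IPv4 total-length field of zero) are consistent with capturing above a NIC that does large send offload, since the capture point sees the packet before the hardware segments it and fills in per-segment lengths. Here is a minimal sketch of how one might flag those artifacts in a captured Ethernet frame; the function name, the synthetic test frame, and the 1500-byte MTU constant are my own illustrative choices, not part of WinPcap's API:

```python
import struct

MTU = 1500          # standard Ethernet payload MTU (excludes the 14-byte Ethernet header)
ETH_HDR_LEN = 14    # destination MAC + source MAC + EtherType

def check_lso_artifacts(frame: bytes) -> list:
    """Inspect a captured Ethernet frame for symptoms of large send offload:
    an IP packet bigger than the MTU, or an IPv4 total-length field of zero
    (plausibly left for the NIC to fill in during segmentation)."""
    findings = []
    ip_len_on_wire = len(frame) - ETH_HDR_LEN
    if ip_len_on_wire > MTU:
        findings.append("oversize: %d-byte IP packet exceeds %d-byte MTU"
                        % (ip_len_on_wire, MTU))
    # IPv4 total length lives in bytes 2-3 of the IP header, big-endian (RFC 791)
    ip_total_len = struct.unpack("!H", frame[ETH_HDR_LEN + 2:ETH_HDR_LEN + 4])[0]
    if ip_total_len == 0:
        findings.append("IP total length field is zero")
    return findings

# Synthetic example: a ~1700-byte IP packet whose length field was never filled in
fake_ip_header = b"\x45\x00" + b"\x00\x00" + b"\x00" * 16  # version/IHL, TOS, length=0
fake_frame = b"\x00" * ETH_HDR_LEN + fake_ip_header + b"\x00" * (1700 - len(fake_ip_header))
for finding in check_lso_artifacts(fake_frame):
    print(finding)
```

In a real capture loop you would run a check like this on each frame WinPcap hands back, which makes it easy to confirm whether the zero-length packets always coincide with the oversize ones.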
