Hi all,
I have a Supermicro blade enclosure with one TwinBlade blade unit installed. It is a dual-node blade with two server boards and two sets of hardware. Each node has two 1GbE Intel NICs, and I also have a two-port Intel 10GbE mezzanine card in each node. The back of the enclosure has a 20-port 1GbE pass-through module that is connected to a Cisco Catalyst 3560-X switch, plus a 10GbE managed switch.
I have installed ESXi 5 on both nodes with identical hardware and BIOS configurations. Node 1 shows 4 vmnics: 2 for the 1GbE ports and 2 for the 10GbE ports. One 1GbE port shows connected, as does one 10GbE port; the other two are disconnected. This is what I expected, and everything is fine.
Node 2 is different for some reason. It shows the same 4 vmnics, but only one port shows connected, and that's a 10GbE one. One of the 1GbE ports should also show connected. I know 100% that the 1GbE NIC is working and the switch configs are all good: the node's BIOS has an IPMI section where I have configured an IP address, and the IPMI shares the 1GbE NIC. The switch shows that link up, I can ping it, and I can log into the IPMI via a web browser at its IP address and open a remote console to the node just fine.
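In case it helps, here is what I was planning to run from the ESXi shell on node 2 to check the link state directly (I'm assuming vmnic2 is the 1GbE port that should be up; adjust the name as needed):

```shell
# List all NICs with driver, link state, and speed as the vmkernel sees them
esxcli network nic list

# Detailed info for the 1GbE port that should be connected
# (vmnic2 is an assumption here - substitute the correct vmnic name)
esxcli network nic get -n vmnic2

# Bounce the link to see if it renegotiates with the pass-through module
esxcli network nic down -n vmnic2
esxcli network nic up -n vmnic2
```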
The other weird thing is that the management network screen on ESXi shows AGP under Hardware for vmnic2 and N/A for the other three. Node 1 shows N/A for all four. I have never seen this AGP thing and am not sure what it means. Also, on node 1, vmnic2 is the 1GbE port that is connected, and I expected it to be the same on node 2.
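To figure out what physical adapter that odd vmnic2 entry actually maps to, I was going to compare the PCI device info on both nodes from the shell (just standard ESXi commands, nothing node-specific assumed):

```shell
# List PCI devices so I can see which physical adapter each vmnic maps to
esxcli hardware pci list

# vmkernel device listing: shows vendor/device IDs next to each vmnic name
vmkchdev -l | grep vmnic
```

If the vendor/device IDs for vmnic2 differ between the two nodes, that would point at a hardware or BIOS-enumeration difference rather than an ESXi config problem.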
Any ideas on why ESXi shows it as disconnected? Right now I can only connect to the host through the 10GbE port.
Thanks
Chris