At this particular moment, no, they're really not usable together unless you're using them as compute cards (e.g. Folding@home, bitcoin mining, etc.). Using them both as display cards for gaming just isn't possible, for this reason: at a basic (and simplified) level, the OS displays by writing data to the graphics card via its driver, and the card then outputs to the monitor connected to it. On boot, the graphics system identifies which card the monitor is plugged into, and the OS loads the appropriate driver for that card. To switch between cards you'd have to shut down, move the cable, and restart - and it's just not worth the effort.
Now, with DX12 coming out, there's a multi-adapter option which can do exactly what you're talking about. It enumerates the available graphics hardware regardless of manufacturer, then enables and utilizes those resources, be it IGP, AMD or nVidia. It effectively makes ALL graphics hardware a single pool and 'knows' what to do with each piece. However, that option has to be specifically enabled by the game dev when they write the game, and it would be (I imagine) a whole hell of a lot of work. Yes, it's a great idea, but I don't think any games currently make use of it.
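To give a rough idea of what that looks like on the developer's side, here's a minimal C++ sketch (my own illustration, not code from any shipping game) of just the enumeration step: DXGI lists every adapter in the machine regardless of vendor, and the game can create a D3D12 device on each one that supports it. Actually splitting the rendering work across those devices and shuffling results between them is the part each game has to implement itself, which is why it's so much effort.

```cpp
// Sketch only: enumerate every GPU in the system and create a D3D12 device on each.
// Assumes Windows 10+ with the D3D12/DXGI SDK headers available.
#include <windows.h>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // Walk every adapter DXGI knows about, regardless of vendor (IGP, AMD, nVidia).
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Skip the software rasterizer ("Microsoft Basic Render Driver").
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;

        // Try to create a D3D12 device on this adapter; if it succeeds,
        // the engine could schedule rendering work on it alongside the others.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Usable adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }

    // From here, dividing the frame across 'devices' and copying results
    // between them is entirely up to the game - the hard, opt-in part.
    return 0;
}
```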