AOC Reveals 16-Inch Monitor That Connects Via USB Type-C

  • Thread starter: Guest
Status
Not open for further replies.
What would be the use for a monitor like this? I'm sure there is one, I'm just not positive what.

I could see this being a secondary (or 3rd/4th) monitor for updates/highlights, or, if it had touch, adding a touchpad to a computer.

Maybe as a secondary to a laptop? Or even a tablet?
 
Hdmark, this is something I would absolutely buy. I bring my mini-ITX PC over to my friend's place often enough that I split the cost of a second monitor with him so I can use it when I come over.

With this monitor, I could just throw it in the same case as my computer and be set for traveling.
 

USB-C requires extra circuitry to route USB 3 and DisplayPort signals through the same connector; HDMI and DP connectors don't. If you put USB-C ports on graphics cards, GPUs would have to add USB circuitry to their display outputs, which would drive up complexity and cost.

Unless displays with USB-C become common, I wouldn't expect GPUs to use type-C any time soon.
 
I think what we actually need is for motherboard manufacturers and GPU makers to develop a (low-latency) way to use the GPU but route the video signal through the motherboard, in order to use the PC's other USB-C ports.

I mean, the graphics card would route its video signal through PCIe to the motherboard, and then out through the motherboard's USB-C.
 
The ASUS MB169C+ is already available with nearly the exact same specs. Not sure about the automatic vertical orientation switching, but that can be done through Windows regardless.
 


That's exactly why I need it. I'm a freelancer. When I go to a client's office and walk them through some work or concepts, only one other person has a good view. If there are 3-4 others, I can have this mirror the display for the other side of the small conference table. In a big meeting room, I just broadcast to the video conference screen, but in an executive's corner office with a small table, it's ideal.
 


It's more of a proof of concept than anything; USB-C could drive (as in supply power and signals to) bigger monitors as well. The only problem is that GPU and motherboard manufacturers are neglecting the tech, as developing the power routing would cut into their bottom line.

I personally would love to use single cable monitors.
 
I've actually started looking at similar monitors that run on regular USB cables to replace the digital picture frame at our reception desk, which scrolls through photos of recent work. The frame is getting flaky after many years and has to be loaded with pictures manually; a USB monitor could run as a second display on the receptionist's computer and show a slideshow app pointed at a folder on our server. Power over a single USB cable is a neat, tidy solution, and this type of monitor tends to have an easel stand like a picture frame.
 

The primary purpose of these types of USB 2.0 or USB 3.0 monitors is portability. If you're traveling and want more than your laptop monitor, you would buy 1 or 2 of these.

Here is an Asus MB168B 15.6" WXGA 1366x768 USB portable monitor. It's inexpensive and has a 3-year warranty with two-way free shipping. You would connect two or more of these monitors to your laptop via a Plugable 7-port USB 3.0 hub (the 25 W powered one).

If you wanted to go all out, using USB 2.0 and USB 3.0 hubs you can run up to 14 monitors on a laptop, as long as you don't game on them.
 


That would still be a pipe dream. Taking a random sample of monitors in my office (22" to 24" HP/Dell panels): they're rated for 115 V at 1.5 A to 2.0 A, which works out to roughly 175 W to 230 W (this is ignoring AC-to-DC conversion loss, which ranges from 10% to 60%, but I don't have the data on hand for these monitors, so bear with me). The USB PD spec tops out at 100 W, so an updated spec would be needed; 20 V at 5 A won't be enough. But this probably won't be the biggest hurdle...
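The napkin math above can be checked in a few lines (the voltage and current figures are the label ratings from my office sample, not measured draw):

```python
# Label ratings from the sampled office panels vs. the USB PD 2.0 cap.
voltage = 115                     # V, input voltage from the labels
amps_low, amps_high = 1.5, 2.0    # A, range across the sampled panels

watts_low = voltage * amps_low    # 172.5 W, rounds to the ~175 W figure
watts_high = voltage * amps_high  # 230 W

usb_pd_max = 20 * 5               # 100 W: 20 V at 5 A, the top PD profile

print(watts_low, watts_high, usb_pd_max)
# Even the low-end panel's label rating is ~75% over what PD can deliver.
```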

Video cards would need to deliver enough power to drive 1 panel? 2 panels? 6 panels? That's 175 W to over 1000 W. The control logic is simple: modern video cards contain billions of transistors, and it would take a few thousand to get the PD spec functional, which is essentially rounding error. But the power problem still hasn't been addressed. For the video card that part is easy: just add a pass-through and pull the power from the power supply.

So about that power supply... OEMs tend to skimp; your typical Dell/HP/etc. desktop comes with a 250 W PSU. To support a single monitor, they would have to start including 400-500 W PSUs. Your fancy AMD Eyefinity/NVIDIA Surround card would need a 1200 W PSU to supply all 6 monitors.
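Sketching that PSU sizing explicitly (the 175 W per-panel figure is the label estimate above, and the 250 W base is the typical OEM PSU; both are rough numbers, not measurements):

```python
# Rough PSU sizing if monitor power were passed through the video card.
PANEL_W = 175        # per-panel label estimate from the office sample
BASE_SYSTEM_W = 250  # typical OEM desktop PSU today

def psu_needed(panels):
    """Base system budget plus full label draw for each attached panel."""
    return BASE_SYSTEM_W + panels * PANEL_W

for n in (1, 2, 6):
    print(n, "panel(s) ->", psu_needed(n), "W")
# 1 panel lands in the 400-500 W class; 6 panels lands around the
# 1200+ W figure above.
```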

All that doesn't take into account the DC 12 V to DC 20 V step-up that's needed. Modern PSUs deliver nearly all of their power on the 12 V rail; to make this remotely feasible, the ATX spec would have to be augmented with a 20 V rail. This will certainly make Seasonic/HiPro/etc. happy...

A few other things to consider: this only takes 24" panels into account; there are still 27"/30"/32" panels I don't have power data for. What about 75" UHD TVs? USB hubs built into the monitor (do they need to support USB PD too)? Monitor speakers?

TLDR: you're going to make Dell/HP cry
 

Instead of reading the label on the back of the monitor, which lists the absolute worst-case ratings, try using a watt/VA meter to measure actual power draw. My 24" Dell UltraSharp only uses 50 W at max brightness and 30 W at my normal brightness level.
 

I'm at work so I can't put a lot of research time into this topic but...

That's the thing: to enable a technology as widespread as USB, I have to handle the worst case. That includes marginal units two standard deviations below average that get bought and sold on eBay.

For your case: your UltraSharp should have come with a USB hub built in. They usually have 4-6 USB ports plus a DC jack on the back for the soundbar. Conservatively, 4 USB 3.0 ports at 1 A each at 5 V, plus your 10 W Dell AX510 under-monitor speaker, adds about 30 W. Now your monitor is at 80 W. There could also be debug circuits that aren't normally powered on by the customer: on the U3014 (which I have at home), holding buttons 1 and 4 enables a built-in diagnostic controller (it looks to be an embedded microcontroller and image generator). There's still more that Dell technicians have access to that I can only guess at (JTAG/SPI? Something is used to flash the firmware). Add in-rush current and a factor of safety for manufacturing variation, and 175 W isn't that insane...
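The tally above, written out (the panel figure is your 50 W watt-meter reading; the hub and soundbar numbers are my conservative estimates, not measurements):

```python
# Tallying the UltraSharp estimate: measured panel draw plus the
# built-in hub and the AX510 soundbar.
panel_w = 50          # watt-meter reading at max brightness
hub_w = 4 * 5 * 1.0   # 4 USB 3.0 ports x 5 V x 1 A each = 20 W
soundbar_w = 10       # Dell AX510 fed from the monitor's DC jack

total_w = panel_w + hub_w + soundbar_w
print(total_w)        # 80 W, before in-rush current and safety margin
```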

I work in a related field, so I'm giving the perspective of an engineer who would be tasked with doing this. For a monitor, I would start with the rating on the back. They put that 175 W value there for a reason, and unless I know the details I can't assume they simply made it up. Doing anything else would be reckless.

All that said, the monitors in my office are 2-4 years old (though LED-backlit); it's very possible that newer monitors are a bit more efficient. This exercise was meant to be a 10-minute napkin calculation :)

 