Minor note for those wanting to try this at home - these cheap dummy plugs only have a 256 byte eeprom, which is not enough for storing the various extended EDID blocks needed to specify high-refresh high-resolution configs. If you just want 1080p60 they're fine, but you won't be able to simulate a 4k240 monitor with them.
Also, some of them have the write-protect line pulled high (or low? don't remember) and you'll need a bit of surgery to actually write to them.
WP high vs low might depend on which chip is used.
One caveat of these dummy plugs is that they don't do HDCP. They handle the typical use case of forcing a specific resolution output for headless machines rather well, but fail for the use case that you need to run something that expects HDCP.
This seems a good place to ask: does anyone know of a good solution like this HDMI dummy plug, but that negotiates HDCP? I need to test video streaming apps that require HDCP to play at full resolution, but it is inconvenient to have a full TV for every test.
The one solution I've found is an HDMI multiviewer, which seems to negotiate HDCP to each port individually.
I use this HDMI splitter. It lets you either set a preprogrammed EDID or learn an EDID from whatever you plug into HDMI output 1, and it then shows up as a connected monitor for as long as the splitter's plugged in, without having to connect anything to the outputs. I believe it negotiates HDCP between the computer/console/whatever and the splitter, then sends the signal to the output monitor without HDCP. https://www.amazon.com/dp/B07VP37KMB
I have a different use case. I have an embedded system that sends out HDMI. However, I want to replace its boot screen with another HDMI stream (a static image would suffice). I specifically don't want to change anything on the embedded system, for a thousand reasons I won't go into. How do I cheaply and robustly do this?
Probably not the most elegant solution, but there are many HDMI switchers out there that can be talked to using GPIO and/or RS-232. You could use one of those and a Raspi: feed the image in from the Raspi, detect the power state of your embedded device via GPIO, and then switch over to the other HDMI input. Whether that is a feasible solution mostly depends on other requirements, e.g. regarding power consumption or space.
I used a similar solution once in an AV context and it has been running reliably for years now.
If you want something less hacky, Blackmagic has the ATEM Mini, a good HDMI switcher that can also store multiple stills and can be switched via Ethernet (among other things).
https://www.ti.com/product/TMDS261B
This and an RP2040 to generate the second signal would probably work just fine; see https://learn.adafruit.com/picodvi-arduino-library-video-out... for an example of how to stretch the rp2040 to do HDMI
AliExpress sells things that claim to terminate HDCP and forward HDMI. Caveat emptor.
I find it crazy that the signal between our monitors and desktop computers is encrypted when these exist.
What's even crazier is that we get the negatives of DRM but none of the upside. I have a 4K HDMI 2.0 TV without HDCP 2, so no 4K content without the "splitter". Also, any interoperability issue is a "catastrophic" failure (as in, at best no content, at worst no HDMI output at all). And yes, they do happen, either because of broken software implementations (some TVs don't reset their HDCP state machine when switching HDMI source) or just dumb electrical issues (I2C - and CEC - have a habit of dying because of leaking charges, and one needs to unplug everything for 10 minutes to fix it).
That's video DRM for you.
Upside? There is no upside, and there never was. Not for the user, and not for anyone else. Video DRM literally never worked.
Nonetheless, it exists, and it makes things worse for everyone by existing.
I even occasionally get audio issues on Netflix (Apple TV plugged into a Samsung 4K OLED TV), which I assume are due to some kind of DRM, though I never dug into it. Sometimes when switching inputs (Fire TV stick / Apple TV), or switching content on the Apple TV between different apps, the Netflix content audio just stops working. All app UI sounds work correctly, but no audio once you hit play. Toggling the Apple TV audio settings a few times between Dolby Atmos and standard stereo usually brings it back, so I assume it has something to do with DRM on the audio tracks, but if anyone has other ideas let me know.
I used to run into this often with Paramount+. I don't know if it's still an issue or not because I cancelled over it (plus them showing me ads when I pay for premium).
When I worked in music streaming I made these arguments all the time. The labels would agree with us wholeheartedly about how shitty DRM was but said they demanded it because the artists demanded they do "something" to stop the copying.
https://youtu.be/z8K08AcVru0?t=627
I assume it has an upside for whoever invented it since they can sell it to everyone
This is how the world works. If you want to get rich, you can sell something that doesn't work, to rich people who believe it does. That's basically how YCombinator works.
The only direct upside to DRM is for the IP owner.
Yes, though for monitors DisplayPort is better anyway, and it doesn't do HDCP.
DisplayPort has supported HDCP (as well as its own DRM scheme DPCP) since version 1.1.
I agree that DisplayPort was better for monitors, but HDMI has basically become DisplayPort so these days they're more or less two sides of the same coin. Both use data packets over fixed rate links now.
You may be thinking of HDMI vs DisplayPort over USB-C, because otherwise they couldn't be more different. In any case, HDMI is still heavily patent- and royalty-encumbered, to the point that it is going to be difficult for open-source GPU drivers to support native HDMI 2.1 or higher going forward, while DisplayPort is still royalty-free.
The situation is so bad that Intel has pretty much skipped native HDMI ports in recent chipset graphics to focus on DP only (motherboards can still install off-the-shelf DP->HDMI converters), while on AMD the newer HDMI features won't be supported at all on Linux.
> You may be thinking of HDMI vs DisplayPort over USB-C, because otherwise they couldn't be more different.
No, I'm referring to how HDMI 2.1 changed basically everything but the connector to become a multi-lane packet based protocol like DisplayPort instead of being a direct descendant of DVI, which itself was basically digital VGA.
I realize the licensing situation is a disaster.
Yep, I'm running an HDMI to DP converter to make Linux work.
There are others that end up doing the same thing - I believe some of the Decimators do - perhaps this?
https://www.decimator.com/Products/MiniConverters/12G-CROSS/...
Try one of these HDMI splitters that are more or less openly advertised as "HDCP strippers" on Amazon.
I think multiviewer might have been a synonym for that.
I wonder if the chips on these dummy plugs are powerful enough to hack in HDCP support though.
The dummy plugs are literally just a 256-byte EEPROM hooked up to the I2C lines; there's nothing else inside the shell.
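For anyone who wants to see that for themselves: with i2c-tools, the entire "monitor" is just the contents of the EEPROM at I2C address 0x50. A minimal sketch (the bus number 2 is an assumption; check which bus your board routes to the HDMI DDC pins):

    # scan the DDC bus -- the EDID EEPROM shows up at address 0x50
    i2cdetect -y 2
    # dump all 256 bytes of the EDID
    i2cdump -y 2 0x50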
Terminating HDCP is difficult: you'd have to downgrade it to HDCP 1.4 and then have an HDCP 1.4 'compliant' device on the end for it to act as a dummy monitor. If you need anything newer than HDCP 1.4, it's likely not possible.
I did a teardown of this Monoprice dongle: https://tomverbeure.github.io/2023/11/26/Monoprice-Blackbird....
It terminates as an HDCP 2.0 endpoint and converts to HDCP 1.4. You’d still need an HDCP 1.4 sink to make it work though.
I'm using the Monoprice multiviewer. It negotiates HDCP without a display attached. Other than being a bit big and expensive, and being unable to strip HDCP, it's a good solution.
I found the same device in generic packaging on AliExpress, but haven't had the chance to order that version, yet.
There are lots of professional SDI converters and such, but they are either $3k+ or "call for price".
That was written by you?
I don't agree with this section:
> The HDCP converter simply announces itself as a final video endpoint… yet still repeats the content to its output port. Without a very expensive HDMI protocol analyzer, we can’t check if the source is tagging the content as type 0 or type 1, but there is no reason now to think that it’s not type 1.
There's no magic in the HDMI protocol that says type 1 vs type 0. It's just another HDCP message over DDC, but it is only sent to repeaters. In this case, since the HDCP repeater is lying about not being a repeater, it isn't getting sent the StreamID Type information.
You’re probably right.
Great teardown. Can these things remove HDCP altogether? It seems like if it can report that the sink is HDCP2.x then it can do so even if it has no compliance at all right? So that would mean it streams an encrypted stream to something that needs to then still do the decryption? These devices seem like they'd be underpowered to do that in real time at 18 Gb/s.
I assume the silicon can do it, but it’s not exposed to the user, because that would almost certainly be a license violation.
Relatedly, is there a good archive of EDID binaries somewhere, or a better program to make them?
I have a nice programmable EDID emulator plug, and I can clone my monitor and others, but sometimes there's a specific resolution or feature I want to set and I don't have a way to (like 8K with DSC, etc.).
I'm aware of https://github.com/bsdhw/EDID but it's rather limited in terms of modern monitors.
I've also written my own using https://www.analogway.com/products/aw-edid-editor but getting all the nuances of the various modes, preference order, etc. right is rather difficult, at least for me :)
I have (too much) experience in EDID editing. My suggestions:
- AW EDID Editor as you mentioned.
- CRU is a Windows-only tool, and will modify the EDID files it dumps from monitors (removing serial/etc. descriptors to make room for detailed resolutions), but will work. It does not run under Wine.
- 010 Hex Editor has an EDID template.
- On Linux you can install wxEDID from Flatpak (IIRC the distribution packages would crash in WxWidgets). I don't think it can create sections though.
- v4l-utils has edid-decode (which can be used as a git diff textconv tool), though this does not help you encode EDID files.
I found that HDMI EDIDs have a CEA extension block while DP EDIDs have a DisplayID extension block. I haven't done any work in multi-page EDIDs with over 256 bytes (and don't know what EEPROM chip you'd use to emulate them, nor the protocol or APIs to read and write them).
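A note on the edid-decode textconv trick mentioned above, since it's handy for versioning EDID blobs: something like this should work (the *.edid file pattern is just an example):

    # .gitattributes
    *.edid diff=edid

    # tell git how to turn the binary blob into text for diffs
    git config diff.edid.textconv edid-decode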
https://www.extron.com/product/software/edidmanager30
https://www1.kramerav.com/au/product/edid%20designer
Both are free but not free to distribute. The Extron one may require you to be working on projects / orgs using their hardware.
I had a similar problem recently: I got a cheap 5.1 surround soundbar that supports up to Dolby TrueHD via HDMI. But here's the catch: it only works with eARC-enabled devices (new-gen TVs). If you plug in your PC you have to use S/PDIF or aux, which hampers the quality. One solution, besides buying an audio extractor/splitter, is to fake the PC's EDID so the soundbar recognizes it as eARC. Still working on this; unfortunately there are no solid guides for it.
What programmable EDID emulator plug do you use? I was recently looking into them, and it wasn’t clear what devices had what features, or if any (cheaper) ones were any good.
https://www.store.level1techs.com/products/p/5megt2xqmlryafj...
Ah. Any affordable alternatives? Or just hacking cheap ones.
You can also buy these with a passthrough, which is useful for older systems that balk at higher resolution monitors.
I have a 2011-era AMD FX8350 system where the onboard 880G northbridge+video chipset doesn't output video over HDMI correctly with a 4K display. Hooking up one of these inline to tell it to send a 1080p image works great, and the monitor then does a 2x integer upscale to 4K.
I also have a couple of passthroughs -- I probably should have mentioned them in the post as another option. The one I have is fancy -- it can read the EDID from a monitor, save it, and use it as an override for another monitor.
Another awesome thing is it can force the monitor to always be detected. One of my monitors virtually unplugs itself when I shut it off, which causes a bunch of issues for me, and the passthrough completely solved it. The one I use is the HD-EWB by THWT.
Doug, thanks for mentioning this. I didn’t realize pass-through ones existed. I’ll check out the one you have. Nice article btw.
Thank you! Hopefully it works for you.
I guess I should reword the way I said something in the previous message: when I said "it can force the monitor to always be detected", I really should have said "it forces the monitor to always be detected".
An HDMI dummy plug is a hardware solution to a software problem that shouldn't exist in the first place.
This sounds like a description of a huge number of extant hardware solutions, to me.
Why are dummy plugs a thing? What can you do with them that you cannot do in software? (asking as a person who had no issues with having 18 virtual displays and no dummies).
One example: I use software called Looking Glass on my PC for interacting with a Windows virtual machine. I have two GPUs in my computer, an AMD one for the Linux host and an NVidia one that gets passed through to the Windows guest. Looking Glass then captures the NVidia GPU's output and displays it in a window on my desktop. This allows me to use Windows software in the VM and get acceptable performance (Windows has basically required graphics acceleration to run acceptably after 7). The problem is that the NVidia GPU will not do anything without having a display connected. NVidia Quadro GPUs support dumping a monitor's EDID and then mapping that file to an output (so the GPU always thinks that monitor is connected to that output), but their consumer-grade GPUs don't support this. That's where the dummy plug comes in.
They make it super simple for someone on the move to do a Zoom or Teams call with one screen and still have access to PowerPoint's presenter view.
Basically use the dummy plug screen for PowerPoint’s output and the laptop screen for the presenter notes. Then share the dummy plug’s screen.
Might not be the best answer for the citizens of Hacker News but so, so easy for teachers and salespeople.
A lot of OS/GPU/driver combinations don't actually let you set up virtual displays with arbitrary settings. And you might want that for streaming with OBS or game streaming via Steam/Parsec/etc.
Some years ago it kind of worked for me on Linux with Xorg and open-source drivers, and on Windows with Nvidia, but when it comes to macOS, or Windows with an AMD or Intel GPU, it simply doesn't work that well.
We use it for testing binary embedded Linux distros, where tricking the OS into thinking there's a display connected introduces a new variable that is not present in the user's deployment - and it's a cheap hardware solution. Buying and installing them is probably more cost-effective than having an engineer write the `echo on > /sys/whatever` and the logic around it.
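For reference, the software-side trick we'd otherwise be scripting looks roughly like this on a Linux DRM stack (the connector name varies per machine, and it needs root):

    # force the kernel to treat the HDMI connector as connected
    echo on > /sys/class/drm/card0-HDMI-A-1/status
    # revert to normal hotplug detection
    echo detect > /sys/class/drm/card0-HDMI-A-1/status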
Dummy plugs are a lot easier for most people. I added a fake 4K monitor to my desktop via software for remote game streaming, and it was a lot more complicated than I expected[^1].
[^1]: https://pfy.ch/programming/4k-sunshine.html
What gpu/driver were you using?
I was using an AMD 5700xt at the time with Mesa drivers
I have a modded Chromebox (booting Windows and Linux) which refuses to boot without a video device attached to the HDMI port, so I had to use a dummy plug.
In addition to what's already been mentioned, I remember there being issues with Macs not unlocking the full abilities of the GPU if there was no display present. Maybe there is some software workaround, but an HDMI dummy is cheap and quick and won't disable itself on updates etc.
Raspberry Pis with remote desktop won't render the desktop unless a monitor is physically plugged in… easiest solution for, say, a PhD student.
Convince a locked-down Fuck You Inc (I mean Apple) device to use a specific display resolution over VNC.
> What can you do with them that you cannot do in software?
It lets a macbook operate with the lid closed.
Presumably it’s for devices which do not have easily modifiable software.
It seems that Linux doesn't support virtual displays. On Windows you can either install a dummy display or have Apollo do it automatically. No such thing on Linux.
Fun tip: the same process works to modify the edid stored on a typical monitor or laptop screen. Sometimes you can even change various settings on the tcon by writing to other i2c addresses. You also don't need a raspberry pi, any computer works.
I did this once on an Acer monitor. I was modifying it to strobe the LED backlight to give blur-free video like a CRT. I had found that the DDC bus exposed other registers to control the backlight, etc., and I had an external circuit connected to do the strobing. I noticed it had registers to read and write the flash ROM, so I dumped it, wrote an 8051 disassembler and did some crude reverse-engineering, then eventually modified the ROM to include a strobed mode (occupying the lower end of the brightness control, so it could be enabled and adjusted via the OSD or normal DDC brightness controls). I did have to go inside and connect the write line when flashing it. There was conveniently an interrupt on vblank, and the timer that controlled the LED backlight had a mode that could turn it on a little before the next interrupt and then off an adjustable amount of time after, just the right timing needed to flash it on after the LCD had settled from updating. Originally I just wanted to remove the several-second boot logo (which I achieved).
The flash chip usually has a write enable/disable pin and most monitors and TVs will wire it to prevent writes to the EDID. I would guess only cheap ones don't bother. It's risky as without protection, a voltage glitch during a read can turn it into a write and trash the flash.
Attaching the write-protect pin to +V is literally free in the PCB design process; IMO not doing so is a design error or decision (though IDK how much thought was placed into allowing users to rewrite the monitor identification).
The author recommends using a Pi while noting it's not a requirement:
>If you attempt these commands on a PC, it’s possible that you could accidentally flash hardware that isn’t an EDID, like a RAM module’s SPD EEPROM.
True, although the i2c controller that the dimms are connected to is an entirely separate device from the i2c controller in the gpu that's connected to the display ports. As long as you know what you're doing the risk is not significant.
Yeah, if you are 100% confident you're using your GPU's I2C controller it's probably fine, but the reason I warned about it repeatedly in the post was because I stumbled upon this GitHub issue where two people accidentally flashed their RAM SPD:
https://github.com/bulletmark/edid-rw/issues/5
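If you do try it on a PC anyway, it's worth at least confirming which adapter you're about to write to. A rough sanity check:

    # list I2C adapters -- the SMBus controller (where DIMM SPD EEPROMs live, also
    # at 0x50-0x57) is a separate device from the GPU's DDC buses, so check the names
    i2cdetect -l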
Makes me think of this anecdote from Linus Torvalds' officemate, from (1)
> At one point, Linus had implemented device files in /dev, and wanted to dial up the university computer and debug his terminal emulation code again. So he starts his terminal emulator program and tells it to use /dev/hda. That should have been /dev/ttyS1. Oops. Now his master boot record started with "ATDT" and the university modem pool phone number. I think he implemented permission checking the following day.
1) https://liw.fi/linux-anecdotes/
> the same process works to modify the edid stored on a typical monitor
That would be a strange oversight by the hardware developers.
Typically they would buy pre-programmed EEPROMs and then place them on a board where the write enable pin is never pulled high.
It would be strange to put an EEPROM into a product like a monitor and leave it writable, but I’ve seen stranger things on shipping hardware.
Yeah, it shouldn't happen - but I've seen it happen. What's worse, the first batch we got from that place wasn't flashed with an EDID at all - and was shipped directly to customers (who mostly didn't notice, because the main product it connected to had a default that worked, though it wasn't optimal; it meant none of those screens could be used with a normal laptop). Ironically, the combination of the two issues meant we could have fixed the EDID in the field, but we didn't dare in case we bricked someone's $x000 TV.
Modern monitors don't even use an EEPROM chip for EDID anymore. The I2C bus is hooked up to a microcontroller inside the monitor, which allows it to implement Display Data Channel. This way you can tune things like display brightness and color profile from an application running on the computer, instead of messing around with the monitor's OSD.
Tools like ddcutil aren't very well-known, but they can be quite useful if you want to do something like DIYing a KVM switch by just having the PC tell the monitor to switch to a different input!
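As a rough sketch of the DIY-KVM idea (VCP feature 0x60 is input select per MCCS, but the legal values are monitor-specific, so check the capabilities output first):

    # list DDC-capable displays
    ddcutil detect
    # see which VCP features (including 0x60, input select) the monitor supports
    ddcutil capabilities
    # switch input -- 0x11 is often HDMI-1, but the value varies by vendor
    ddcutil setvcp 60 0x11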
> DIYing a KVM switch by just having the PC tell the monitor to switch to a different input!
I made a tiny contribution to the ddcutil-db database when I did exactly that. My monitor wasn't supported initially, but it wasn't hard to use the utils and templates to poke around and find the correct addresses for basic settings and input switching.
It was a nice afternoon's work to get it all working.
Do any monitors use the I2C multi-peripheral feature to allow both DDC communication and an I2C EEPROM to exist at different addresses on the same bus? Or is it cheaper to integrate functionality into a controller chip? (Though DP tunnels EDID over the aux bus, and (I assume) doesn't use an EEPROM to begin with.)
The specification is explicitly designed to allow for it, but I honestly doubt it is very popular - if used at all.
There are two main issues here. The first is that the standard EDID EEPROM is very limited in size, and a lot of monitors need more space. VESA solved this by adding a dummy "segment selector" register, located on a separate I2C address. This makes it incompatible with off-the-shelf I2C EEPROM chips, so you'd need some kind of custom EDID-specific EEPROM chip anyways.
The second issue is that most monitors have multiple input ports. A regular EEPROM chip can only be hooked up to a single port (I2C itself supports it, but the spec forbids it), so you'd need one EEPROM chip per port. That gets expensive quite quickly.
If you're already implementing DDC/CI via some kind of microcontroller, why not have it deal with EDID as well? Heck, you could even give the microcontroller a separate connection to an EEPROM to make it easier to program! The EDID part is absolutely trivial, I bet you could implement it in two dozen instructions without even trying very hard. No reason to make it even harder for yourself by keeping it separate.
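For the curious, that segment selector lives at I2C address 0x30: you write the segment number there and then read from 0x50 within the same transaction. A sketch with i2c-tools (the bus number is an example, and not every controller handles the combined transfer):

    # select segment 1 (EDID blocks 2 and 3) and read block 2 from offset 0;
    # the segment write and the data read must be one combined I2C transaction
    i2ctransfer -y 2 w1@0x30 0x01 w1@0x50 0x00 r128@0x50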
A friend had to reflash a monitor (Acer K222HQL) with a corrupted EDID over the HDMI port. I confirmed that it has three input ports (VGA, DVI, and HDMI) each with their own EEPROM chip next to the port (the friend had to lift a pin on the HDMI EEPROM to successfully reflash it; she should've connected it to ground but didn't). I found a manual online (https://global-download.acer.com/GDFiles/Document/User%20Man...) saying that the monitor supports DDC, implying it does do the multi-peripheral I2C trick.
I have another broken monitor's mainboard where the VGA and DVI's EDID pins go through 100 ohm resistors to {unpopulated 8-pin footprints, as well as the main chip}. I think this means the design considered saving EDID on dedicated EEPROM chips, but ended up integrating the data on the display receiver instead.
I'm confused. I was expecting to see the plug hooked to the GPIO I2C pins, but just plugging it into the RPi appears to be enough. So, does HDMI directly expose an I2C interface?
Yep -- the DDC link, which is the comms part of VGA/HDMI/DVI, is basically an I2C link, and that's how the OS queries the EDID in normal operation.
As the article notes, the RPi has an I2C controller wired up to those HDMI port pins, because it needs to be able to read the EDID over DDC.
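If you just want to see what the OS has already read over DDC, you don't even need to touch I2C yourself; the kernel exposes the raw blob (connector name varies per machine):

    edid-decode < /sys/class/drm/card0-HDMI-A-1/edid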
Does anyone know of a cheap DisplayPort EDID emulator to fix issues with a KVM and linux? Last time I checked, they were much more expensive than HDMI, to the point where it would be better to buy a new KVM.
The issue here is that DisplayPort doesn't use a basic EEPROM hooked up to an I2C bus for EDID. Instead it uses the high-speed DisplayPort-specific AUX bus, which is significantly more complicated to mess around with. Heck, I don't think you can even find any decent documentation about it without joining VESA and signing a bunch of NDAs.
You can override EDID in the kernel options (https://foosel.net/til/how-to-override-the-edid-data-of-a-mo...), but I don't know if you want to add a virtual monitor (unsure if https://askubuntu.com/questions/453109/add-fake-display-when... works).
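The override in that first link boils down to roughly this (connector and file names are examples; older kernels spell the parameter drm_kms_helper.edid_firmware):

    # put the EDID blob where the firmware loader can find it (it may also need
    # to be included in the initramfs if the driver loads early)
    cp my-monitor.bin /lib/firmware/edid/
    # then boot with this on the kernel command line
    drm.edid_firmware=HDMI-A-1:edid/my-monitor.bin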
Have a look at https://github.com/DisplayLink/evdi and (shameless plug, toy project, be gentle) https://github.com/mlukaszek/evdipp
Can dummy plugs be used to change device fingerprinting?
hex dump for the USB ibus2 plug extract is concatenated to EDID
been eyeing this kind of solution
I don't get it - why do you need to spoof having a monitor connected for a Raspberry Pi? Surely they boot up fine regardless?
Did you read the article? The RPI is just being used to reprogram the dummy plug.