Instead of removing the preference bit that marks the hardware-declared mode
preference, leave it in place and just move the user-preferred mode to the
front of the list, marking it with the USERPREF bit, which causes it to be
selected by the initial mode selection code.
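In sketch form, assuming the doubly linked, NULL-terminated probed-mode list
used by hw/xfree86/modes (the helper name is made up):

    /* Hypothetical helper: pull the user's mode to the head of the list
     * and tag it, leaving the hardware's M_T_PREFERRED bit untouched on
     * the EDID-preferred mode. */
    static void
    mark_user_preferred(xf86OutputPtr output, DisplayModePtr match)
    {
        /* unlink from wherever it currently sits */
        if (match->prev)
            match->prev->next = match->next;
        if (match->next)
            match->next->prev = match->prev;
        if (output->probed_modes == match)
            output->probed_modes = match->next;

        /* relink at the front so initial mode selection sees it first */
        match->prev = NULL;
        match->next = output->probed_modes;
        if (output->probed_modes)
            output->probed_modes->prev = match;
        output->probed_modes = match;

        match->type |= M_T_PREFERRED | M_T_USERPREF;
    }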
A lot of EDID writers apparently end up stuffing centimeters into the
detailed timings (as the maximum image size field does) instead of
millimeters, and some of them get it wrong in only one direction. Also, add
a quirk that lets us mark the largest 75Hz mode as preferred, which will
often be the right choice for EDID 1.0 CRTs.
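The fix-up itself is small; a sketch of its shape, with the quirk flags
following the DDC_QUIRK_* convention and the size fields treated as
illustrative:

    /* Once a per-monitor quirk records which direction is wrong, scale
     * the detailed-timing image size back to millimeters. */
    if (quirks & DDC_QUIRK_DETAILED_H_IN_CM)
        timing->h_size *= 10;   /* stored as cm, the spec says mm */
    if (quirks & DDC_QUIRK_DETAILED_V_IN_CM)
        timing->v_size *= 10;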
xf86RandR12ScreenSetSize must protect calls to EnableDisableFBAccess with
suitable vtSema checks to avoid invoking driver code while the X server is
inactive.
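A sketch of the guard (pScrn being the xf86 ScrnInfoPtr):

    /* Only poke the driver while we own the VT. */
    if (pScrn->vtSema)
        (*pScrn->EnableDisableFBAccess)(pScrn->scrnIndex, FALSE);

    /* ... resize the screen pixmap here ... */

    if (pScrn->vtSema)
        (*pScrn->EnableDisableFBAccess)(pScrn->scrnIndex, TRUE);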
The multi-crtc cursor code in hw/xfree86/modes holds a reference to the
current cursor. This reference must be correctly ref counted so the cursor
is not freed out from underneath this code.
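A minimal sketch of the pattern (the cache field name is illustrative;
CursorRec carries a refcnt and FreeCursor() drops one reference):

    /* Cache the cursor: take a reference so a client freeing it cannot
     * leave us holding a dangling pointer. */
    if (cursor)
        cursor->refcnt++;
    xf86_config->cursor = cursor;

    /* Later, when replacing or tearing down the cached cursor: */
    if (xf86_config->cursor)
        FreeCursor(xf86_config->cursor, None);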
As a result, we can remove the quirks that existed just to flip the bits
back around for us. This is not confirmed in all cases, owing to a lack of
bug reports containing the EDID blocks associated with those quirks, but it
is likely true.
Set the new randr crtc of the output before the output change notification is
delivered to the clients.
Remove RROutputSetCrtc, as it is not really necessary. All we have to do is
set the output's crtc in RRCrtcNotify.
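Roughly, inside RRCrtcNotify (the real bookkeeping in rrcrtc.c also detaches
outputs that have left the crtc):

    /* Attach the new outputs before any events are queued. */
    for (o = 0; o < numOutputs; o++) {
        if (outputs[o]->crtc != crtc) {
            outputs[o]->crtc = crtc;
            RROutputChanged(outputs[o], FALSE);
        }
    }
    /* ... RRCrtcChanged()/RRTellChanged() then deliver notification. */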
When the PreferredMode option is selected in the config file, remove the
M_T_PREFERRED bit from all other preferred modes to force the config file
mode to be selected.
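A minimal sketch, assuming the configured mode has already been matched
against the output's probed list:

    /* Strip the preference bit everywhere else so initial selection can
     * only land on the configured mode. */
    for (mode = output->probed_modes; mode; mode = mode->next)
        if (mode != configured)
            mode->type &= ~M_T_PREFERRED;
    configured->type |= M_T_PREFERRED | M_T_USERPREF;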
Code that disabled mode detection on disabled outputs would confuse
applications by listing said outputs as connected but without any modes.
This makes the disabled state in the config file affect only the initial
configuration and not subsequent modifications by RandR.
The DDX code was ignoring pending properties when computing whether a mode
set was required. This meant that configurations differing only in property
values would not cause the mode to be set.
I made a mistake in some new code using MakeAtom, passing the size of the
string instead of the length of the string. Figuring there might be other
such mistakes, I reviewed the server code and found four bugs of the same
form.
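The pattern, with a made-up property name:

    #include <string.h>     /* strlen */

    const char name[] = "SOME_PROPERTY";

    /* Wrong: sizeof counts the trailing NUL (14 here), so the atom's
     * name carries an embedded '\0' and never matches a correct atom. */
    Atom bad  = MakeAtom(name, sizeof(name), TRUE);

    /* Right: the length excludes the terminator (13 here). */
    Atom good = MakeAtom(name, strlen(name), TRUE);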
Validate modes against the maximum screen dimensions at server startup, and
not against the virtual X/Y parameters, as those can change. This fixes an
issue where, when canGrow is TRUE, modes would get dropped when validated
against the virtual X/Y parameters.
DRI uses a non-screen block/wakeup handler which will be executed after the
screen block handler finishes. To ensure that the rotation block handler is
executed under the DRI lock, dynamically wrap the screen block handler for
rotation.
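A sketch of the wrap, assuming the saved pointer lives in the crtc config
private and using xf86RotateRedisplay() for the shadow update; this is the
server's usual unwrap/call-down/re-wrap dance:

    static void
    xf86RotateBlockHandler(int screenNum, pointer blockData,
                           pointer pTimeout, pointer pReadmask)
    {
        ScreenPtr         pScreen = screenInfo.screens[screenNum];
        ScrnInfoPtr       pScrn   = xf86Screens[screenNum];
        xf86CrtcConfigPtr config  = XF86_CRTC_CONFIG_PTR(pScrn);

        pScreen->BlockHandler = config->BlockHandler;
        (*pScreen->BlockHandler)(screenNum, blockData, pTimeout, pReadmask);
        config->BlockHandler = pScreen->BlockHandler;   /* may have changed */
        pScreen->BlockHandler = xf86RotateBlockHandler; /* re-wrap */

        /* Still inside the screen BlockHandler chain, so this runs before
         * DRI's non-screen handler and thus under the DRI lock. */
        xf86RotateRedisplay(pScreen);
    }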
Leaving devices enabled during server startup can cause problems during the
initial mode setting in the server, especially when they are used for
different purposes by the X server than by the BIOS. Disabling all of them
before any mode setting is attempted provides a stable base upon which the
remaining mode setting operations can be built.
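In sketch form, using the xf86 output/crtc dpms hooks (the helper name is
hypothetical):

    static void
    disable_everything(ScrnInfoPtr scrn)
    {
        xf86CrtcConfigPtr config = XF86_CRTC_CONFIG_PTR(scrn);
        int o, c;

        /* Outputs first, so no monitor is watching a crtc go down... */
        for (o = 0; o < config->num_output; o++)
            config->output[o]->funcs->dpms(config->output[o], DPMSModeOff);

        /* ...then the crtcs themselves. */
        for (c = 0; c < config->num_crtc; c++)
            config->crtc[c]->funcs->dpms(config->crtc[c], DPMSModeOff);
    }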
SourceValidate is used exclusively by the software cursor code to pull the
cursor off of the screen before the screen is used as a source operand. When
painting the rotated image, though, this removes the software cursor from
the frame buffer. Temporarily setting the screen's function pointer to NULL
disables it and lets the cursor image be captured.
(cherry picked from commit 05e1c45ade)
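In sketch form (SourceValidateProcPtr is the screen hook's type):

    SourceValidateProcPtr SourceValidate;

    /* Park the hook: with it NULL, miSprite never undraws the cursor
     * from our source, so the composite captures the cursor image. */
    SourceValidate = pScreen->SourceValidate;
    pScreen->SourceValidate = NULL;

    /* ... composite from the screen into the rotated shadow here ... */

    pScreen->SourceValidate = SourceValidate;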
Setting a mode on an unrotated CRTC was causing all of the rotation updates
to be disabled; the loop looking for active rotation wasn't actually
examining each crtc, it examined the modified crtc over and over.
(cherry picked from commit 8b217dee3a)
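The fix amounts to indexing with the loop counter; roughly:

    Bool rotating = FALSE;
    int  c;

    for (c = 0; c < xf86_config->num_crtc; c++) {
        xf86CrtcPtr scan = xf86_config->crtc[c];   /* was: the crtc just set */

        if (scan->rotation != RR_Rotate_0)
            rotating = TRUE;
    }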
I've managed to solve my own bug (#10545) by applying the following
patch to the xserver.
Please apply.
<Conspiracy mode on>
This monitor is "Vista Certified". I wonder if this is a pure coincidence...
<Conspiracy mode off>
With kind regards
Erik Andrén
(cherry picked from commit a63704f14a)
Option "Enable" "True" will force the server to enable an output at startup
time, even if the output is not connected. This also causes the default
modes to be added for this output, allowing even sync ranges to be used to
pick out standard modes.
(cherry picked from commit a3d73ba2cb)
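A hypothetical xorg.conf fragment (output, monitor, and driver names made
up; the Device option binds the Monitor section to the output):

    Section "Monitor"
        Identifier  "DigitalPanel"
        Option      "Enable" "True"   # light the output even if disconnected
        HorizSync   30.0 - 80.0       # with default modes added, bare sync
        VertRefresh 50.0 - 75.0       # ranges suffice for standard modes
    EndSection

    Section "Device"
        Identifier  "Card0"
        Driver      "intel"
        Option      "Monitor-TMDS-1" "DigitalPanel"
    EndSection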
By default, use the screen monitor section for output 0, however, a driver
can change which output gets the screen monitor by calling
xf86OutputUseScreenMonitor.
(cherry picked from commit f4a8e54caf)
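From the driver side this is a one-liner per output; a sketch for a driver
that wants its panel, rather than output 0, to inherit the screen's Monitor
section (variable names made up):

    xf86OutputUseScreenMonitor(vga_output, FALSE);
    xf86OutputUseScreenMonitor(lvds_output, TRUE);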
This Acer monitor reports support for a 75Hz refresh rate via EDID, and yet
when that rate is delivered, the monitor does not sync and reports out of
range. Use the existing 60Hz quirk for this monitor.
(cherry picked from commit 1328a288e9)
xf86SetSingleMode tries to resize all CRTCs to match the selected mode. When
a CRTC has no matching mode, it now disables that CRTC (instead of crashing).
Also, poke the RandR extension when xf86SetSingleMode is done so that
appropriate events can be delivered, and so that future RandR queries return
correct information.
(cherry picked from commit dc6c4f6989)
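Roughly, the two changes (the mode matcher is a hypothetical stand-in):

    for (c = 0; c < config->num_crtc; c++) {
        xf86CrtcPtr    crtc = config->crtc[c];
        DisplayModePtr crtc_mode;

        crtc_mode = find_matching_mode(crtc, desired);  /* hypothetical */
        if (!crtc_mode) {
            crtc->enabled = FALSE;      /* no match: turn it off */
            continue;
        }
        /* ... set crtc_mode on this crtc ... */
    }
    xf86DisableUnusedFunctions(pScrn);
    xf86RandR12TellChanged(pScreen);    /* poke RandR to deliver events */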