New RRCrtcGetTransform function in DIX that DDX can use to get the pending
transform. The DDX code should be complete; the DIX code is just a stub at
this point.
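A minimal sketch of how a DDX might call it, assuming RRCrtcGetTransform()
returns the pending RRTransformPtr (check randrstr.h for the exact prototype;
the driver helper below is illustrative):

    #include "randrstr.h"

    /* Sketch only: fetch the pending transform for a CRTC and hand it to
     * driver-specific code. */
    static void
    ddx_apply_pending_transform(RRCrtcPtr randr_crtc)
    {
        RRTransformPtr pending = RRCrtcGetTransform(randr_crtc);

        if (pending)
            driver_program_transform(randr_crtc->devPrivate, pending);
    }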
Drivers that care about crtc positions on the screen (to ensure that vblank
works correctly) need to be notified when crtcs are changed.
Provide a hook in the mode setting code that is invoked whenever any
configuration is done to the screen.
Use this new hook in the DRI code so that DRI clients are notified and
receive updated information.
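A hedged sketch of how a consumer like DRI might use such a hook; the wrapper
names below follow the modes code's naming conventions but should be verified
against xf86Crtc.h, and DRIUpdateCrtcInfo() is a hypothetical helper:

    static xf86_crtc_notify_proc_ptr dri_wrapped_notify;

    static void
    dri_crtc_notify(ScreenPtr pScreen)
    {
        /* CRTC geometry changed: refresh what DRI clients see so their
         * vblank calculations use the new positions (hypothetical helper) */
        DRIUpdateCrtcInfo(pScreen);

        /* chain to whatever was wrapped before us */
        if (dri_wrapped_notify)
            dri_wrapped_notify(pScreen);
    }

    static void
    dri_hook_crtc_config(ScreenPtr pScreen)
    {
        dri_wrapped_notify = xf86_wrap_crtc_notify(pScreen, dri_crtc_notify);
    }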
Signed-off-by: Keith Packard <keithp@keithp.com>
When a user specifies the position of an output for which no modes exist
(for whatever reason), assume that the width and height of this output
are 0. The result will be the same as if this output weren't taken into
consideration at all and thus should be sane. It will prevent a segfault
when trying to determine the width and height of a non-existent mode.
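Roughly, the guard looks like the sketch below (illustrative names, not the
literal server code):

    static void
    output_initial_size(xf86OutputPtr output, int *width, int *height)
    {
        DisplayModePtr mode = output->probed_modes;

        if (!mode) {
            /* no modes at all: treat the output as 0x0 so it simply drops
             * out of the screen-size computation instead of crashing */
            *width = *height = 0;
            return;
        }
        *width = mode->HDisplay;
        *height = mode->VDisplay;
    }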
- Redo damage naming for more consistency.
- Call post submission functions only where appropriate.
- EXA can now live without its odd damage workarounds.
- Use a single common function to compute reducedness.
- Call it from both the old-school and new-school mode validation paths.
- Define monitor reduced-blanking support in accord with EDID 1.4.
- Attempt to filter RB DMT modes away from the "standard" EDID pool if
the monitor doesn't claim RB support.
On some panels you end up with all of:
- No range descriptor
- No description of physical connectivity
- Native panel size mode in standard timings list
In principle you're supposed to use the timings for that mode from the DMT
spec, but in practice the DMT spec has timings for both 1920x1200 normal
and 1920x1200RB, and the standard timing field gives you no way to
distinguish. And, of course, the non-RB timings don't fit in a single
DVI link.
In the single-output-enabled case we never enter the loop, so 'test' never
gets set and we fail to match a good mode.
This was causing my 2560x1600 to end up at 2048x1536.
Conflicts:
Xext/xprint.c (removed in master)
config/hal.c
dix/main.c
hw/kdrive/ati/ati_cursor.c (removed in master)
hw/kdrive/i810/i810_cursor.c (removed in master)
hw/xprint/ddxInit.c (removed in master)
xkb/ddxLoad.c
If the monitor isn't reduced-blanking (either through EDID logic, or
config file setting), then remove RB modes from the default pool. Any
RB modes from the driver and config file pools will stick around though;
you asked for them, you got them.
The first guess used to be "is the preferred mode for one output the
preferred mode on all outputs". Instead, do "find the largest mode that's
preferred for at least one output and available on all outputs".
Old logic was just the first one that happened to have an associated
CRTC. The new logic tries to find one that's definitely connected, has
probed modes, and has the largest candidate mode.
LeaveVT/EnterVT cycles will free/realloc shadow frame buffers. Because of
this, the presence/absence of that data is insufficient to know whether
the screen function wrappers are necessary. Instead, the 'transform_in_use'
flag should be used.
This patch also adds 'xf86RotateFreeShadow' for drivers to use at LeaveVT
time to free the rotation data; it will be reallocated on EnterVT.
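A minimal sketch of the intended driver usage, assuming the pre-1.13 LeaveVT
signature (the driver-specific names are illustrative):

    static void
    DriverLeaveVT(int scrnIndex, int flags)
    {
        ScrnInfoPtr pScrn = xf86Screens[scrnIndex];

        /* drop the rotation/transform shadow buffers; they will be
         * reallocated on the next EnterVT / xf86SetDesiredModes */
        xf86RotateFreeShadow(pScrn);

        /* ... usual VT-leave work (restore console mode, etc.) ... */
    }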
This patch (and not setting HARDWARE_CURSOR_BIT_ORDER_MSBFIRST on big endian
platforms) fixes it for me with the radeon driver and doesn't break intel.
Correct patch this time :)
Should have done this in the first place. Since we're already checking for
the absence of the get_crtc callback, we'll short circuit the later call
and disable the output, so the ugly "continue" block is unnecessary.
By adding a new output callback, ->get_crtc, xf86SetDesiredModes is able to
avoid turning off outputs & CRTCs if the current output<->CRTC mappings are the
same as the desired configuration. This helps avoid flickering displays at
startup time, which speeds things up a little and looks better.
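A hedged sketch of what a driver's ->get_crtc hook might look like; the
register-read helper is hypothetical, the rest follows the xf86Crtc structures:

    static xf86CrtcPtr
    driver_output_get_crtc(xf86OutputPtr output)
    {
        xf86CrtcConfigPtr config = XF86_CRTC_CONFIG_PTR(output->scrn);
        int pipe = driver_read_output_pipe(output);   /* hypothetical hw query */

        if (pipe < 0 || pipe >= config->num_crtc)
            return NULL;    /* output disabled or routing unknown */
        return config->crtc[pipe];
    }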
Unless we check for vtSema before calling into the CRTC and output callbacks,
we may end up trying to access video memory that no longer exists, leading to a
crash. So if we don't have vtSema, return FALSE to the caller, indicating that
we didn't do anything.
Fixes #14444.
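The guard itself is just a vtSema check at the top of the affected entry
points, roughly (function name illustrative):

    Bool
    driver_set_modes_checked(ScrnInfoPtr pScrn)
    {
        if (!pScrn->vtSema)
            return FALSE;   /* not on our VT: video memory may be unmapped */

        /* ... safe to call the CRTC and output callbacks here ... */
        return TRUE;
    }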
Actually more like in the mainline case, where the ideal mode happens to
be the very first aspect match on the first monitor. But let's not
split hairs.
While the ScreenRec's notion of size in millimeters would get updates,
the RANDR 1.1 notion wouldn't, so your screen would appear to be square
and probably at some ludicrous DPI.
i.e., don't check for the end of the list by ->name == NULL, since that
won't work now. Fix the consumers of xf86DefaultModes to use the new
explicit size as well.
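A sketch of the consumer-side change, assuming the mode table now comes with
an explicit count (the identifier for that count is an assumption here, and
output_add_mode() is a hypothetical helper):

    extern const DisplayModeRec xf86DefaultModes[];
    extern const int xf86NumDefaultModes;

    static void
    add_default_modes(xf86OutputPtr output)
    {
        int i;

        /* walk by count; a NULL ->name no longer terminates the table */
        for (i = 0; i < xf86NumDefaultModes; i++)
            output_add_mode(output, &xf86DefaultModes[i]);
    }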
In order to report accurate values to users of the RandR property interface,
it's sometimes necessary to ask the driver to update the value (for example
when backlight brightness changes without the server's knowledge, due to hotkey
events or direct sysfs banging).
This patch wires up the core server code with a new xf86CrtcFuncs callback,
get_property, to allow for this.
The new code is available under the RANDR_13_INTERFACE define, which in turn
depends on the RANDR_12_INTERFACE code.
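A hedged sketch of a driver-side get_property hook, following the callback
shape the commit describes (the backlight helpers are hypothetical):

    static Atom backlight_atom;     /* interned elsewhere by the driver */

    static Bool
    driver_crtc_get_property(xf86CrtcPtr crtc, Atom property)
    {
        if (property == backlight_atom) {
            int level = driver_read_backlight(crtc);        /* hypothetical */
            driver_update_backlight_property(crtc, level);  /* hypothetical:
                                                                pushes the fresh
                                                                value to RandR */
        }
        return TRUE;    /* property value is now current */
    }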
Old heuristic was to find the first monitor that expressed a preference,
then attempt to get all other monitors to agree. This doesn't work
particularly well when the two sets of modes don't precisely intersect:
you get overlapping-but-not-identical output geometry and things go wrong.
New heuristic is:
- Exact user preference, if given
- Exact output preference, if the same for all outputs
- Best (largest) mode of modes common to all outputs:
  - with the same aspect ratio as all outputs (may be NULL)
  - with 4:3 aspect ratio
- Then the old heuristic to try to get something lit
Note that it is simply not doable to have a reliable initial output guess if
you insist on trying to clone all outputs together. It's far too easy to
end up with displays that simply don't have modes in common. We need to
switch to right-of placement someday, once we're not limited to CRTC size
limits and we have working multi-GPU in RANDR.
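The "largest mode common to all outputs" step boils down to something like the
following sketch (find_matching_mode() stands in for whatever mode-equality
helper is used):

    static DisplayModePtr
    best_common_mode(xf86OutputPtr *outputs, int num_outputs)
    {
        DisplayModePtr best = NULL, m;
        int o;

        for (m = outputs[0]->probed_modes; m; m = m->next) {
            Bool on_all = TRUE;

            for (o = 1; o < num_outputs && on_all; o++)
                if (!find_matching_mode(outputs[o]->probed_modes, m))
                    on_all = FALSE;

            if (on_all && (!best || m->HDisplay * m->VDisplay >
                                    best->HDisplay * best->VDisplay))
                best = m;
        }
        return best;    /* may be NULL; fall back to the older heuristic */
    }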
If you don't do this, then Modes "800x600" in the Display subsection will
be dutifully ignored and the driver will start at whatever resolution it
feels like.
CVT is different enough from GTF that it should not be used on monitors
that aren't expecting it. This brings us closer to what the spec says
the correct behaviour is.
Before this it was meaningless to try to mark DisplayModeRec tables
const, since the mode name would be emitted as a pointer to an
anonymous string constant, and therefore would have to be fixed up by
ld.so and so couldn't live in .rodata. With this change the standard
mode lists can live in .rodata, and modes duplicated from them will
have their names filled in on the fly.
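Roughly, this enables a pattern like the sketch below; xf86SetModeDefaultName()
is assumed to be the helper that generates the "WxH" name when a mode is
copied out of a const table:

    static const DisplayModeRec default_modes[] = {
        { .HDisplay = 1024, .VDisplay = 768, .name = NULL /* timings omitted */ },
    };

    static DisplayModePtr
    copy_default_mode(int i)
    {
        DisplayModePtr mode = xf86DuplicateMode(&default_modes[i]);

        if (mode && !mode->name)
            xf86SetModeDefaultName(mode);   /* fills in "1024x768" on the fly */
        return mode;
    }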
xf86CrtcRotate() is called by randr 1.2 drivers via xf86CrtcSetMode() or xf86SetDesiredModes()
during ScreenInit(), at which point pScrn->pScreen is not set. If a user specifies a rotation
in their config file, pScrn->pScreen is dereferenced and boom.
If the originating mode didn't have a name, we would end up with the name of
the original mode being set up correctly, but with the name of the copy still
being NULL.
Instead of removing the preference bit marking the hardware declared mode
preference, leave it in place and just move the user preferred mode to the
front of the list while marking it with the USERPREF bit which will cause it
to be selected by the initial mode selection code.
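In list terms the operation is roughly the following sketch (simplified to a
singly linked walk over the probed modes):

    static DisplayModePtr
    promote_user_preferred(DisplayModePtr list, DisplayModePtr user)
    {
        DisplayModePtr m, prev = NULL;

        for (m = list; m && m != user; m = m->next)
            prev = m;
        if (!m)
            return list;            /* user mode not in this list */

        user->type |= M_T_USERPREF; /* initial mode selection picks this */
        if (prev) {                 /* unlink and move to the front */
            prev->next = user->next;
            user->next = list;
            list = user;
        }
        return list;
    }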
A lot of EDID writers apparently end up stuffing centimeters (like the
maximum image size field) into the detailed timings, instead of millimeters.
Some of them only get it wrong in one direction. Also, add a quirk to let
us mark the largest 75hz mode as preferred, which will often be used for
EDID 1.0 CRTs.
over to new system.
Need to update documentation and address some remaining vestiges of
old system such as CursorRec structure, fb "offman" structure, and
FontRec privates.
xf86RandR12ScreenSetSize must protect calls to EnableDisableFBAccess with
suitable vtSema checks to avoid invoking driver code while the X server is
inactive.
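A minimal sketch of the guard, wrapping the driver call in a vtSema check (the
wrapper function name is illustrative):

    static void
    fb_access_if_active(ScrnInfoPtr pScrn, Bool enable)
    {
        /* only poke the driver while we own the VT; otherwise the
         * framebuffer may not even be mapped */
        if (pScrn->vtSema)
            pScrn->EnableDisableFBAccess(pScrn->scrnIndex, enable);
    }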
The multi-crtc cursor code in hw/xfree86/modes holds a reference to the
current cursor. This reference must be correctly ref counted so the cursor
is not freed out from underneath this code.
As a result, we can remove the quirks that existed to flip the bits back around
for us. This is not confirmed in all cases due to lack of bugs containing EDID
blocks associated with the quirks, but is likely true.
Set the new randr crtc of the output before the output change notification is
delivered to the clients.
Remove RROutputSetCrtc as it is not really necessary. All we have to do is set
the output's crtc on RRCrtcNotify.
When the PreferredMode option is selected in the config file, remove the
M_T_PREFERRED bit from all other preferred modes to force the config file
mode to be selected.
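The bit manipulation amounts to the following sketch (helper name is
illustrative):

    static void
    prefer_only(DisplayModePtr modes, DisplayModePtr user_preferred)
    {
        DisplayModePtr m;

        for (m = modes; m; m = m->next)
            if (m != user_preferred)
                m->type &= ~M_T_PREFERRED;  /* drop hardware preference */

        user_preferred->type |= M_T_PREFERRED;
    }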
Code that disabled mode detection on disabled outputs would confuse
applications by listing said outputs as connected but without any modes.
This makes the disabled state in the config file affect only the initial
configuration and not subsequent modifications by RandR.
The DDX code was ignoring pending properties for computing when mode setting
was required. This meant that configurations differing only in property
values would not cause the mode to be set.
I made a mistake in some new code using MakeAtom, passing the size of the
string instead of the length of the string. Figuring there might be other
such mistakes, I reviewed the server code and found four bugs of the same
form.
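The bug class looks like this (the property name is just an example): sizeof
on a string literal or array counts the trailing NUL, while MakeAtom wants the
string length.

    #include <string.h>

    static Atom
    intern_example(void)
    {
        const char name[] = "EXAMPLE_PROPERTY";

        /* wrong: MakeAtom(name, sizeof(name), TRUE);  -- one byte too long */
        return MakeAtom(name, strlen(name), TRUE);
    }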