Closed Bug 83060 Opened 24 years ago Closed 18 years ago

Bad dpi default for X

Categories

(Core Graveyard :: GFX: Xlib, defect)

x86
Linux
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED WORKSFORME
Future

People

(Reporter: BenB, Assigned: roland.mainz)

References

Details

[Font system]

By default, the dpi pref is 96dpi. This is wrong for X, which ships with a
default of either 100dpi or 75dpi, I think (Debian definitely uses the former).
We should use the same default.
This bug is especially bad in combination with bug 83061 (I can't even set the
correct value in the UI).
X ships with some fonts *designed* for 75 and 100 dpi resolutions respectively.
I doubt you'll find many monitors running at *exactly* those resolutions,
though. So the fonts are normally slightly "wrong" either way, compared to the
exact point size they would have if printed on paper.

xdpyinfo here for instance shows 98x104 (running at 1280x1024).

Since a pixel consists of three dots, tuning the resolution in an app to a
number divisible by three makes sense to me, at least. Thus, 72 and 96 dpi are
good choices for displaying something that calls itself "75" and "100". It
doesn't encourage blurriness.
 
Hmm, not sure if I was clear enough. What I'm saying is that I think you are
confusing font sizes and screen resolution. There is a connection, in that the
higher the screen resolution, the smaller a font will appear. But the font
itself - whether you call it "75" or "hello" - is just a bitmap.
Actually, specifying any number is wrong. Mozilla should just use the X
server's value. One of the X protocol's features is that I can specify
the DISPLAY variable and point a client at any screen I want (modulo the
usual permission issues). As long as Mozilla sits on top of the Xlib
client libraries, it has no say on this.
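
For illustration, a minimal sketch (assuming nothing about Mozilla's internals)
of how any Xlib client can ask whatever server DISPLAY points at for the
resolution it advertises; these are the same numbers xdpyinfo prints:

  /* Sketch only: query the DPI the X server advertises for $DISPLAY.
     Build with: cc -o xdpi xdpi.c -lX11 */
  #include <stdio.h>
  #include <X11/Xlib.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);   /* honours the DISPLAY variable */
      if (!dpy)
          return 1;
      int scr = DefaultScreen(dpy);
      double xdpi = 25.4 * DisplayWidth(dpy, scr)  / DisplayWidthMM(dpy, scr);
      double ydpi = 25.4 * DisplayHeight(dpy, scr) / DisplayHeightMM(dpy, scr);
      printf("resolution: %.0fx%.0f dots per inch\n", xdpi, ydpi);
      XCloseDisplay(dpy);
      return 0;
  }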

As for the bitmapped fonts, it's a little more complicated. The 75dpi
and 100dpi fonts are not quite the same. They have slightly different
metrics. It appears that they were not designed at the same time.

In addition, X thinks in terms of points while Mozilla thinks in terms
of pixels. The bitmap fonts have both pixel and point (actually
decipoint) values hardcoded into them. Round-off errors have crept in, so
the pixel <--> point mapping is not one-to-one. (Bug 40373 deals with
this.) If you throw in the fact that 96 is not quite 100, you'll
discover that screen resolution is more complicated than it should be.
It's the Unix way. :-)
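
To make the round-off concrete, here is a small sketch of the usual conversion
(one point = 1/72 inch; XLFD bitmap fonts carry the point size in decipoints).
The exact rounding Mozilla uses may differ:

  /* Sketch of the pixel <-> point conversion and where round-off creeps in. */
  #include <stdio.h>
  #include <math.h>

  int main(void)
  {
      double dpi = 100.0;            /* assumed server resolution */
      double decipoints = 120.0;     /* a nominal "12 pt" bitmap font */

      double pixels = decipoints / 10.0 * dpi / 72.0;   /* 16.67 px */
      int rounded = (int)floor(pixels + 0.5);           /* 17 px */

      /* Converting back does not land on 120 decipoints exactly: */
      double back = rounded * 72.0 / dpi * 10.0;        /* 122.4 */
      printf("%g px -> %d px -> %g decipoints\n", pixels, rounded, back);
      return 0;
  }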
Sure, we should ask the X server (bug 81904). But if you give some predefined
values, they should be the most common ones. And those are 100 and 75 on Unix,
not 72 and 96.
Can someone tell me ONE single monitor that has an actual resolution of exactly
75dpi or 100dpi under X? I'd be delighted.
To resummarize the requests in this bug:
1. Replace 72 and 96 with 75 and 100.
2. Set the default to 0, if not done in bug 83061.
Note that on other platforms, 72 and 96 may be more appropriate. I know 72 is a
Mac thing and 96 may be a Windows thing.

You will have to demonstrate that having these values slightly off actually
causes a problem before we spend effort writing a platform-specific solution.

Gerv
> You will have to demonstrate that having these values slightly off actually
> causes a problem before we spend effort writing a platform-specific solution.

See ten thumbs' comments in bug 83061.

No, I don't need to demonstrate anything. Common values for X are 100 and 75,
and I must be able to set them. Why not just *add* them instead of replacing them?
I'm partially affected by this bug, because what Mozilla measures out on my
screen (141 dpi) is rather too large a value... however, the themes are also
affected.  Can't we just put this in an "advanced DPI setting" box like I've
described in bug 109830?
Blocks: 109830
The default should be the system setting. Eventually someone could probably
quickly hack together a little external helper to let you calibrate it
properly. I briefly wrote up how I set mine:
http://bugzilla.mozilla.org/show_bug.cgi?id=109830#c5
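
Something like this hypothetical helper (a sketch only, not an existing tool)
would do: measure the visible width of the screen with a ruler and let it
print the DPI value to configure:

  /* Hypothetical calibration helper (sketch): given the measured physical
     width of the visible screen area in mm, print the true horizontal DPI.
     Usage: ./calibrate-dpi <width-in-mm>   e.g. ./calibrate-dpi 340 */
  #include <stdio.h>
  #include <stdlib.h>
  #include <X11/Xlib.h>

  int main(int argc, char **argv)
  {
      if (argc != 2)
          return 1;
      double width_mm = atof(argv[1]);
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy || width_mm <= 0)
          return 1;
      int scr = DefaultScreen(dpy);
      double dpi = 25.4 * DisplayWidth(dpy, scr) / width_mm;
      printf("set your X resolution / Xft.dpi to about %.0f\n", dpi);
      XCloseDisplay(dpy);
      return 0;
  }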
This is not a table-specific bug; reassigning to core owner.
Assignee: karnaze → attinasi
Reassigning to Don.
Assignee: attinasi → dcone
Target Milestone: --- → Future
->GFX: Xlib?
Assignee: dcone → Roland.Mainz
Component: Layout → GFX: Xlib
QA Contact: petersen → timeless
Since longer ago than I can remember, in the absence of a
browser.display.screen_resolution override via user.js, a new profile is
automatically set to 0, in order to use the X server DPI. X can be set to use
whatever DPI you please. (see the freshly overhauled
http://www.mozilla.org/unix/dpi.html) X also can serve scalable fonts. Why is
this bug still open?
The default is -1 which means max(96dpi, System setting).
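In other words, roughly this logic (identifiers are illustrative, not the
actual Gecko code):

  /* Sketch of the browser.display.screen_resolution semantics described
     above; names are illustrative, not actual Gecko identifiers. */
  #include <stdio.h>

  static int effective_dpi(int pref, int system_dpi)
  {
      if (pref > 0)
          return pref;               /* explicit user override */
      if (pref == 0)
          return system_dpi;         /* trust the X server unconditionally */
      return system_dpi > 96 ? system_dpi : 96;  /* default -1: floor at 96 */
  }

  int main(void)
  {
      printf("%d %d %d\n",
             effective_dpi(-1, 75),    /* -> 96  */
             effective_dpi(-1, 126),   /* -> 126 */
             effective_dpi(0, 75));    /* -> 75  */
      return 0;
  }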

-> WORKSFORME
Status: NEW → RESOLVED
Closed: 18 years ago
Resolution: --- → WORKSFORME
Why 96 and not 100? See above.

I have not noticed this being an actual issue recently, though.
Status: RESOLVED → REOPENED
Resolution: WORKSFORME → ---
Possibly because it's the default on Windows?
I found a post by Felix Miata that provides some background:
http://archivist.incutio.com/viewlist/css-discuss/68515

I also note that CSS 2.1 recommends basing the reference pixel on a 96dpi
device:
http://www.w3.org/TR/CSS21/syndata.html#length-units
Modern distros generally try to set DPI accurately via a query of the display hardware. When it works correctly, DPI will be accurate enough that 1 pt on screen and 1 pt printed will match in size. The modern rule is that the actual X DPI will most likely be neither 75 nor 100 nor anything particularly close to either.

Using 75 or 100 is now pretty much obsolete. Those values date from the days when most if not all Linux fonts were bitmaps. Now most fonts used by X by default and by web pages are scalable, and scalable fonts, at least in theory, don't have any particular affinity to particular DPI settings.

IIRC, KHTML does as Gecko does and sets 96 as a floor. This is of no particular import in the browser viewport on most web pages, as most do not set font sizes in pt. However, on those that do, it means the viewport content on Linux will tend to match that found on the average doz system, which is something that makes web designers happy, and protects Linux users from tinier fonts on such pages.

I'm not a Gnome user, but IIRC, Gnome also sets 96 as a configurable floor.

More background on 96: http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx
Given the last comment -> WORKSFORME
Status: REOPENED → RESOLVED
Closed: 18 years ago18 years ago
Resolution: --- → WORKSFORME
Please re-open this bug!

I suffered from it a few days ago with a default install of Fedora Core 6 [1], and I was happy when I found that this bug was already filed.

Basically, the default for X on a 17" monitor capable of 1600x1200 is, surprise, to use the resolution of 1600x1200.  This gives an actual resolution of 126dpi. If programs assume a resolution of 96dpi and try to display 10pt fonts, then I get (10pt * 96/126) fonts, i.e. roughly 7.6pt fonts, which are hard to read.

Yes, comment #19 is right that this makes things compatible with (previous versions of [2]) the "doz" system: I was always aware that using higher resolutions made them unusable, even if you set the infamous "larger fonts" option.

Is it really the goal of the project to emulate every single bug that users of the "doz" OS had to live with?

Yes, X servers use scalable fonts almost exclusively these days. And most of the font sizes are specified in pts, not pixels, these days.  Consequently, one should be able to use a higher resolution on their screen, and things should just work.  Everything should look the same, just sharper.

Hardwiring 96 dpi here and there makes this impossible, and effectively forces me to use "the resolution" which corresponds to my display size.  It's mildly annoying when I know that my CRT could do better if the software were not so buggy.  But what should a happy owner of a ThinkPad Txx notebook with a 14" LCD at 1600x1200 do?  (The only other option for the Txx line was 1024x768, which may not be enough for everyone.)

Felix Miata also wrote in comment #19:
> I'm not a Gnome user, but IIRC, Gnome also sets 96 as a configurable floor.

I work for Red Hat, and Gnome is the default desktop on Fedora.  At least on Fedora Core 6, the default is 96 dpi, and it is *hardwired*, not a floor.  And this is a bug, reported in [3].

"The right thing" is obviously to use the real resolution, which the X server computes from the real size of the screen, which the monitor reports over DDC.  KDE (which I never used) seems to do it this way, facing many why-do-you-look-different bug reports [4].  KDE also went to the trouble of explaining to users that the X server should know the resolution, and if it does not, it is a problem with their (auto-)configuration.

Now that KDE has done the pioneering work and most relevant monitors "just work", the time has come to follow its example.

(The "configurable floor defaulting to 96 dpi" was a good hack in its time, and would be a big step forward from the "hardwired 96 dpi" state.  But perhaps we can skip it now.)

Footnotes:
[1] https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=241236
[2] Reportedly, Vista has the 96 dpi bug fixed, and is truly scalable to any resolution.  (Is it really wise to fix the bug on GNU/Linux only when/if Vista gets widespread?)
[3] https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=241232
[4] http://dot.kde.org/1138743149/1138769674/
(In reply to comment #21)
> Basically, the default for X on a 17" monitor capable of 1600x1200 is,
> surprise, to use the resolution of 1600x1200.  This gives an actual
> resolution of 126dpi. If programs assume a resolution of 96dpi and try to
> display 10pt fonts, then I get (10pt * 96/126) fonts, i.e. roughly 7.6pt
> fonts, which are hard to read.

Gecko floors at 96 DPI. When X uses 126 DPI, Gecko uses 126 DPI.

Anyone using 1600x1200 on a 16" actual 4:3 display is inviting tiny everything, but with Linux, that can be relatively easily compensated for.
 
> Yes, X servers use scalable fonts almost exclusively these days. And most of
> the font sizes are specified in pts, not pixels, these days.

Not just these days. Virtually everywhere else and forever sizes are measured in pt. Gecko is the Maverick, for several reasons. One is that point sizing is more granular. As resolution increases from 72 PPI, the disparity grows from 0 at 72 PPI to 2:1 at 144 PPI. 

> Consequently, one
> should be able to use a higher resolution on their screen, and things should
> just work.  Everything should look the same, just sharper.

That's pretty much how it works in current KDE.

If everything's smaller than you want it to be, you can configure it to be larger without deviating from native or preferred screen resolution. You can do as doz does and fudge the working DPI to a value that produces more comfortable sizes (see the URL in comment 19). Either via Xft.dpi and/or other methods found at http://www.mozilla.org/unix/dpi.html you can force the working X DPI upward by about 1/3, but in any event to whatever level suits your preference.
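
For reference, a sketch of how an application can pick up an Xft.dpi override
from the server's resource database (assuming it was merged in via xrdb or
~/.Xresources); toolkit specifics are omitted:

  /* Sketch: read the Xft.dpi resource a user has set, e.g. via
     "Xft.dpi: 120" in ~/.Xresources merged with xrdb. */
  #include <stdio.h>
  #include <X11/Xlib.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy)
          return 1;
      char *value = XGetDefault(dpy, "Xft", "dpi");
      if (value)
          printf("Xft.dpi is %s\n", value);
      else
          printf("Xft.dpi not set; fall back to the server value\n");
      XCloseDisplay(dpy);
      return 0;
  }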

> "The right thing" is obviously to use the real resolution, which the X server
> computes from the real size of the screen, which the monitor reports over DDC.

I personally set X to particular semi-arbitrary DPI values according to the nature of whatever testing I am doing. I normally stick to multiples of 12, testing at 72, 84, 96, 108, 120, 132, 144, etc. In the latest versions of xorg, I keep Xft.dpi unset, and use 'Option "NoDDC"' and 'DisplaySize' to achieve the desired result at the chosen screen resolution. My KDE server desktop runs on a 19.8" display at 1600x1200, which is about 100 DPI, but is set to 144 DPI to keep everything nice and big for my antique eyes.
(In reply to comment #22)
> Gecko floors at 96 DPI. When X uses 126 DPI, Gecko uses 126 DPI.

OK, so the problem is not present in current gecko.
I hope it gets fixed in my firefox soon, too.
(https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=241236)
Product: Core → Core Graveyard