Coredump Xorg Every Boot

I just noticed that my Xorg has been producing a core dump on every boot, starting about 2-2.5 hours after my initial install one week ago.

Investigating led me to try the LTS kernel. And boom: core dump.

linux 5.10.15.arch1-1
linux-lts 5.4.97-1

I’m not even sure what to post to help figure out why it’s dumping. After the dump, X starts up just fine. I’ve had no usability problems, other than noticing the dump in journalctl on every boot, plus the core dump file itself.

I’m on a MacBook Pro 11,x.

What would I do to start investigating the reason? Or should I just forget it and live with it?
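A first triage step for a dump like this (a sketch; the sample lines below stand in for real `coredumpctl info <PID>` output) is to skip past the libc abort machinery and look at the first frames inside the server itself:

```shell
# Sample frames copied from a trace; on a real system you would pipe
# `coredumpctl info <PID>` through the grep instead of this variable.
trace='#0  0x00007fea3eb2eef5 raise (libc.so.6 + 0x3cef5)
#1  0x00007fea3eb18862 abort (libc.so.6 + 0x26862)
#2  0x000055dceb5026ea OsAbort (Xorg + 0x14a6ea)
#11 0x00007fea206fffb9 glamor_init (libglamoregl.so + 0xcfb9)'

# Drop the libc frames (raise/abort/signal plumbing); what remains is
# where the crash actually originated.
printf '%s\n' "$trace" | grep -v 'libc\.so'
```

On the affected machine, `coredumpctl list`, `coredumpctl info <PID>`, and `journalctl -b -1 _COMM=Xorg` (for the previous boot's Xorg messages, including the FatalError text that precedes the abort) are the usual next steps.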

[root@gardy lib]# coredumpctl gdb 909
           PID: 909 (Xorg)
           UID: 0 (root)
           GID: 0 (root)
        Signal: 6 (ABRT)
     Timestamp: Sat 2021-02-13 10:05:12 MST (24min ago)
  Command Line: /usr/lib/Xorg -dpi 120 :0 -seat seat0 -auth /run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
    Executable: /usr/lib/Xorg
 Control Group: /system.slice/lightdm.service
          Unit: lightdm.service
         Slice: system.slice
       Boot ID: 9b61b519a8f64436b5903660ea873306
    Machine ID: 0198156c9c3b42ed83b37e44bf187dd4
      Hostname: gardy
       Storage: /var/lib/systemd/coredump/core.Xorg.0.9b61b519a8f64436b5903660ea873306.909.1613235912000000.zst
       Message: Process 909 (Xorg) of user 0 dumped core.

                Stack trace of thread 909:
                #0  0x00007fea3eb2eef5 raise (libc.so.6 + 0x3cef5)
                #1  0x00007fea3eb18862 abort (libc.so.6 + 0x26862)
                #2  0x000055dceb5026ea OsAbort (Xorg + 0x14a6ea)
                #3  0x000055dceb5041b1 FatalError (Xorg + 0x14c1b1)
                #4  0x000055dceb509e09 n/a (Xorg + 0x151e09)
                #5  0x00007fea3eb2ef80 __restore_rt (libc.so.6 + 0x3cf80)
                #6  0x00007fea3eb2eef5 raise (libc.so.6 + 0x3cef5)
                #7  0x00007fea3eb18862 abort (libc.so.6 + 0x26862)
                #8  0x00007fea3eb18747 __assert_fail_base.cold (libc.so.6 + 0x26747)
                #9  0x00007fea3eb27646 __assert_fail (libc.so.6 + 0x35646)
                #10 0x000055dceb44d870 n/a (Xorg + 0x95870)
                #11 0x00007fea206fffb9 glamor_init (libglamoregl.so + 0xcfb9)
                #12 0x00007fea3de6b0fd n/a (modesetting_drv.so + 0x140fd)
                #13 0x000055dceb43140e AddGPUScreen (Xorg + 0x7940e)
                #14 0x000055dceb53deb9 n/a (Xorg + 0x185eb9)
                #15 0x000055dceb573288 n/a (Xorg + 0x1bb288)
                #16 0x000055dceb57352b n/a (Xorg + 0x1bb52b)
                #17 0x000055dceb523435 InitInput (Xorg + 0x16b435)
                #18 0x000055dceb3f1798 n/a (Xorg + 0x39798)
                #19 0x00007fea3eb19b25 __libc_start_main (libc.so.6 + 0x27b25)
                #20 0x000055dceb3f25de _start (Xorg + 0x3a5de)
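Since the abort happens during AddGPUScreen → glamor_init (frames #13 and #11 above), one guess — not a confirmed fix — would be to stop the server from auto-adding secondary GPU screens and see whether the dump disappears, e.g. with a drop-in config (filename is hypothetical; AutoAddGPU is a standard ServerFlags option):

```
# /etc/X11/xorg.conf.d/10-no-autogpu.conf  (hypothetical filename)
Section "ServerFlags"
    Option "AutoAddGPU" "off"
EndSection
```

On dual-GPU MacBooks this would keep Xorg from initializing glamor on the second GPU at startup; whether that is the actual trigger here is only a hunch.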

Looking through the boot logs, no dump was produced when I booted at 16:10 that day. Then, on the next boot at 16:33: core dump.

Same kernel (5.10.14-arch1-1) both times.

Would it be something else installed in that 23 minute window?
How do I determine what was installed in that window of time?

Wouldn’t it be you yourself who installed something in that timeframe (if you are the only one who uses said computer)? :grin:

Nevertheless, you could check that. Just have a look at /var/log/pacman.log
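To pull out just what landed in a specific window, you can filter the log by its bracketed ISO timestamps (a sketch; the sample lines below are stand-ins — on a real system point awk at /var/log/pacman.log instead):

```shell
# Stand-in for /var/log/pacman.log contents:
log='[2021-02-08T16:19:45-0700] [ALPM] installed pahole (1.19-1)
[2021-02-08T16:30:21-0700] [ALPM] installed acpilight (1.2-2)
[2021-02-08T17:05:00-0700] [ALPM] installed something-later (1.0-1)'

# Split on the square brackets so $2 is the timestamp, then keep
# "installed" lines inside the 16:10-16:33 window; ISO timestamps
# compare correctly as plain strings.
printf '%s\n' "$log" |
  awk -F'[][]' '$2 >= "2021-02-08T16:10" && $2 <= "2021-02-08T16:33" && /installed/'
# → prints the pahole and acpilight lines
```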


Well, I had just installed the OS that day, and I installed a dozen or more applications within a two-hour window. My memory is OK, but remembering which packages were installed in that 23-minute window a week ago is a bit tough.

But here’s the pacman log for that window of time. I uninstalled acpilight just now and rebooted, still dumped.

[2021-02-08T16:19:41-0700] [PACMAN] Running 'pacman --sync -- extra/pahole community/python-pytz community/python-babel community/python-docutils community/python-imagesize extra/python-markupsafe community/python-jinja community/python-pygments community/python-snowballstemmer community/python-sphinx-alabaster-theme community/python-sphinxcontrib-applehelp community/python-sphinxcontrib-devhelp community/python-sphinxcontrib-htmlhelp community/python-sphinxcontrib-jsmath community/python-sphinxcontrib-qthelp community/python-sphinxcontrib-serializinghtml community/python-sphinx community/python-sphinx_rtd_theme extra/netpbm extra/gts extra/graphviz extra/xmlto'
[2021-02-08T16:19:45-0700] [ALPM] transaction started
[2021-02-08T16:19:45-0700] [ALPM] installed pahole (1.19-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-pytz (2021.1-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-babel (2.9.0-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-docutils (0.16-4)
[2021-02-08T16:19:45-0700] [ALPM] installed python-imagesize (1.2.0-4)
[2021-02-08T16:19:45-0700] [ALPM] installed python-markupsafe (1.1.1-6)
[2021-02-08T16:19:45-0700] [ALPM] installed python-jinja (2.11.3-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-pygments (2.7.4-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-snowballstemmer (2.1.0-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinx-alabaster-theme (0.7.12-6)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-applehelp (1.0.2-3)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-devhelp (1.0.2-3)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-htmlhelp (1.0.3-3)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-jsmath (1.0.1-6)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-qthelp (1.0.3-3)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinxcontrib-serializinghtml (1.1.4-3)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinx (3.4.3-1)
[2021-02-08T16:19:45-0700] [ALPM] installed python-sphinx_rtd_theme (0.5.1-2)
[2021-02-08T16:19:45-0700] [ALPM] installed netpbm (10.73.33-1)
[2021-02-08T16:19:45-0700] [ALPM] installed gts (0.7.6.121130-2)
[2021-02-08T16:19:45-0700] [ALPM] installed graphviz (2.46.0-1)
[2021-02-08T16:19:46-0700] [ALPM] installed xmlto (0.0.28-4)
[2021-02-08T16:19:46-0700] [ALPM] transaction completed
[2021-02-08T16:19:46-0700] [ALPM] running '30-systemd-update.hook'...
[2021-02-08T16:19:46-0700] [ALPM] running 'detect-old-perl-modules.hook'...
[2021-02-08T16:19:46-0700] [PACMAN] Running 'pacman --database --asdeps -- pahole python-pytz python-babel python-docutils python-imagesize python-markupsafe python-jinja python-pygments python-snowballstemmer python-sphinx-alabaster-theme python-sphinxcontrib-applehelp python-sphinxcontrib-devhelp python-sphinxcontrib-htmlhelp python-sphinxcontrib-jsmath python-sphinxcontrib-qthelp python-sphinxcontrib-serializinghtml python-sphinx python-sphinx_rtd_theme netpbm gts graphviz xmlto'
[2021-02-08T16:30:19-0700] [PACMAN] Running 'pacman --sync -- acpilight'
[2021-02-08T16:30:21-0700] [ALPM] transaction started
[2021-02-08T16:30:21-0700] [ALPM] installed acpilight (1.2-2)
[2021-02-08T16:30:21-0700] [ALPM-SCRIPTLET]   The installed udev rules expect your user to be in the video group to modify
[2021-02-08T16:30:21-0700] [ALPM-SCRIPTLET]   the relevant files in the /sys hierarchy.
[2021-02-08T16:30:21-0700] [ALPM] transaction completed
[2021-02-08T16:30:21-0700] [ALPM] running '30-systemd-udev-reload.hook'...
[2021-02-08T16:30:21-0700] [ALPM] running '30-systemd-update.hook'...

Ah ok, I see…

Well, I don’t really have a clue, but it could be of interest which DE you are using. My first thought would be a graphics driver problem. What GPU do you have/use?

What’s the output of inxi -G? (You may need to install the package first.)

Does it happen at random? Or can it be kind of linked with a certain “action” like starting a specific program?


You could check whether the journal file itself is corrupted.

journalctl --verify

Also

systemctl status systemd-journald.service

I am seeing a similar issue with the PC I migrated to EndeavourOS yesterday.

During boot I see an Xorg core dump.

coredumpctl
TIME                            PID   UID   GID SIG COREFILE  EXE
Sun 2021-03-07 10:31:40 CET     480     0     0   6 present   /usr/lib/Xorg
Sun 2021-03-07 11:44:28 CET     848     0     0   6 present   /usr/lib/Xorg
Sun 2021-03-07 16:00:15 CET     793     0     0   6 present   /usr/lib/Xorg
Sun 2021-03-07 16:03:17 CET     735     0     0   6 present   /usr/lib/Xorg
Mon 2021-03-08 08:26:25 CET     763     0     0   6 present   /usr/lib/Xorg
Mon 2021-03-08 08:30:06 CET     760     0     0   6 present   /usr/lib/Xorg
Mon 2021-03-08 08:32:18 CET     507     0     0   6 present   /usr/lib/Xorg
Mon 2021-03-08 08:53:29 CET     715     0     0   6 present   /usr/lib/Xorg

Example:

coredumpctl gdb 480
           PID: 480 (Xorg)
           UID: 0 (root)
           GID: 0 (root)
        Signal: 6 (ABRT)
     Timestamp: Sun 2021-03-07 10:31:40 CET (23h ago)
  Command Line: /usr/lib/Xorg :0 -seat seat0 -auth /run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
    Executable: /usr/lib/Xorg
 Control Group: /system.slice/lightdm.service
          Unit: lightdm.service
         Slice: system.slice
       Boot ID: b069bbc9d80041d599a2ac85838cedb6
    Machine ID: 86a4558f93854185968861c6dbe3859c
      Hostname: unten
       Storage: /var/lib/systemd/coredump/core.Xorg.0.b069bbc9d80041d599a2ac85838cedb6.480.1615109500000000.zst
       Message: Process 480 (Xorg) of user 0 dumped core.
                
                Stack trace of thread 480:
                #0  0x00007f034e90aef5 raise (libc.so.6 + 0x3cef5)
                #1  0x00007f034e8f4862 abort (libc.so.6 + 0x26862)
                #2  0x0000562a8df966ea OsAbort (Xorg + 0x14a6ea)
                #3  0x0000562a8df981b1 FatalError (Xorg + 0x14c1b1)
                #4  0x0000562a8df9de09 n/a (Xorg + 0x151e09)
                #5  0x00007f034e90af80 __restore_rt (libc.so.6 + 0x3cf80)
                #6  0x00007f034e90aef5 raise (libc.so.6 + 0x3cef5)
                #7  0x00007f034e8f4862 abort (libc.so.6 + 0x26862)
                #8  0x00007f034e8f4747 __assert_fail_base.cold (libc.so.6 + 0x26747)
                #9  0x00007f034e903646 __assert_fail (libc.so.6 + 0x35646)
                #10 0x0000562a8dee1870 n/a (Xorg + 0x95870)
                #11 0x00007f034000cfb9 glamor_init (libglamoregl.so + 0xcfb9)
                #12 0x00007f034db500fd n/a (modesetting_drv.so + 0x140fd)
                #13 0x0000562a8dec540e AddGPUScreen (Xorg + 0x7940e)
                #14 0x0000562a8dfd1eb9 n/a (Xorg + 0x185eb9)
                #15 0x0000562a8e007288 n/a (Xorg + 0x1bb288)
                #16 0x0000562a8e007b12 n/a (Xorg + 0x1bbb12)
                #17 0x0000562a8df96331 n/a (Xorg + 0x14a331)
                #18 0x0000562a8df91c00 WaitForSomething (Xorg + 0x145c00)
                #19 0x0000562a8de85914 n/a (Xorg + 0x39914)
                #20 0x00007f034e8f5b25 __libc_start_main (libc.so.6 + 0x27b25)
                #21 0x0000562a8de865de _start (Xorg + 0x3a5de)
                
                Stack trace of thread 775:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 873:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 876:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 758:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 759:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 770:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 885:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 771:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 764:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 761:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 766:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 773:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 780:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 776:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 879:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 767:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 772:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 871:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 874:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 778:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 768:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 875:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 882:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 872:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 870:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 760:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c434164 n/a (swrast_dri.so + 0x6f0164)
                #3  0x00007f034c41afd8 n/a (swrast_dri.so + 0x6d6fd8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 877:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 769:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 881:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 880:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 878:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 777:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 774:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c41e1a4 n/a (swrast_dri.so + 0x6da1a4)
                #3  0x00007f034c41afa8 n/a (swrast_dri.so + 0x6d6fa8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 884:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 779:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)
                
                Stack trace of thread 883:
                #0  0x00007f034e7c89ba __futex_abstimed_wait_common64 (libpthread.so.0 + 0x159ba)
                #1  0x00007f034e7c2260 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf260)
                #2  0x00007f034c231cec n/a (swrast_dri.so + 0x4edcec)
                #3  0x00007f034c2304c8 n/a (swrast_dri.so + 0x4ec4c8)
                #4  0x00007f034e7bc299 start_thread (libpthread.so.0 + 0x9299)
                #5  0x00007f034e9cd053 __clone (libc.so.6 + 0xff053)

Xorg (or rather lightdm) is up and running anyway, so if you are sitting in front of the PC you don’t even notice that there was an Xorg core dump. I only found it because I checked the journal.

This happens with linux-lts 5.10.21-1 and linux 5.11.2.arch1-1.

I do not have a dedicated GPU but use the integrated graphics of an AMD Ryzen 5 3400G.

inxi -F
System:    Host: unten Kernel: 5.10.20-1-lts x86_64 bits: 64 Desktop: Cinnamon 4.8.6 Distro: EndeavourOS 
Machine:   Type: Desktop Mobo: Micro-Star model: B450-A PRO MAX (MS-7B86) v: 4.0 serial: <superuser required> 
           UEFI: American Megatrends LLC. v: M.C0 date: 02/03/2021 
CPU:       Info: Quad Core model: AMD Ryzen 5 3400G with Radeon Vega Graphics bits: 64 type: MT MCP L2 cache: 2 MiB 
           Speed: 1252 MHz min/max: 1400/3700 MHz Core speeds (MHz): 1: 1252 2: 1438 3: 1336 4: 1257 5: 1364 6: 1538 7: 1255 
           8: 1346 
Graphics:  Device-1: Advanced Micro Devices [AMD/ATI] Picasso driver: amdgpu v: kernel 
           Display: x11 server: X.org 1.20.10 driver: loaded: amdgpu,ati unloaded: fbdev,modesetting,vesa 
           resolution: <missing: xdpyinfo> 
           OpenGL: renderer: AMD Radeon Vega 11 Graphics (RAVEN DRM 3.40.0 5.10.20-1-lts LLVM 11.1.0) v: 4.6 Mesa 20.3.4 
Audio:     Device-1: Advanced Micro Devices [AMD/ATI] Raven/Raven2/Fenghuang HDMI/DP Audio driver: snd_hda_intel 
           Device-2: Advanced Micro Devices [AMD] Family 17h HD Audio driver: snd_hda_intel 
           Device-3: MCS FURUTECH ADL GT40α type: USB driver: hid-generic,snd-usb-audio,usbhid 
           Sound Server: ALSA v: k5.10.20-1-lts 
Network:   Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet driver: r8169 
           IF: enp34s0 state: up speed: 1000 Mbps duplex: full mac: 2c:f0:5d:5d:55:3c 
Drives:    Local Storage: total: 2.73 TiB used: 1.97 TiB (72.1%) 
           ID-1: /dev/sda vendor: Western Digital model: WD20EZRZ-00Z5HB0 size: 1.82 TiB 
           ID-2: /dev/sdb vendor: Samsung model: SSD 850 EVO 1TB size: 931.51 GiB 
Partition: ID-1: / size: 58.27 GiB used: 15.02 GiB (25.8%) fs: xfs dev: /dev/sdb1 
           ID-2: /boot/efi size: 300.4 MiB used: 280 KiB (0.1%) fs: vfat dev: /dev/sdb2 
           ID-3: /home size: 872.49 GiB used: 286.07 GiB (32.8%) fs: xfs dev: /dev/sdb3 
           ID-4: /opt size: 1.82 TiB used: 1.4 TiB (76.7%) fs: xfs dev: /dev/sda1 
Swap:      Alert: No Swap data was found. 
Sensors:   System Temperatures: cpu: 35.4 C mobo: N/A gpu: amdgpu temp: 35.0 C 
           Fan Speeds (RPM): N/A 
Info:      Processes: 299 Uptime: 2h 38m Memory: 60.83 GiB used: 1.84 GiB (3.0%) Shell: Zsh inxi: 3.3.00

Now I have disabled iommu via grub:
GRUB_CMDLINE_LINUX="iommu=off"

That seems to fix it. I will keep an eye on it.
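For anyone following along: a kernel parameter added in /etc/default/grub only takes effect after the grub config is regenerated. A minimal sketch, assuming the default Arch/EndeavourOS paths:

```shell
# /etc/default/grub -- append the parameter to the existing variable
GRUB_CMDLINE_LINUX="iommu=off"

# then regenerate the config so the change is picked up on the next boot
grub-mkconfig -o /boot/grub/grub.cfg
```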

Did you check sudo dmesg for any other info related to errors?

Edit: Do you have the latest UEFI BIOS update for your board? Mine is an MSI and the latest update is perfect. I have no issues and I run with iommu=on, controlled by the board in the settings.

The PC has an MSI B450-A PRO MAX with latest M.C0 BIOS.

And I just realized that the xorg coredumps also happen with iommu=off

Have you looked at this?

https://wiki.archlinux.org/index.php/Core_dump#Disabling_automatic_core_dumps
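Per that wiki page, systemd-coredump's storage can be switched off via its config file. A minimal sketch (note this only stops the dumps from being stored, it does not fix whatever is crashing):

```
# /etc/systemd/coredump.conf
[Coredump]
Storage=none
```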

Edit: I have core dumps too if I list them with coredumpctl list.

Well, my problem is not the core dump itself. My problem is that Xorg crashes and I want to know why.

Check your RAM timings and voltage settings. Are you using XMP?

Example: instead of 15-15-15-15-36 it may need to be 16-18-18-18-36, and make sure the frequency is correct (e.g. 3200 MHz) and the voltage as well (e.g. 1.350 V).

The RAM is OK; memtest86 does not find any issues. It is XMP RAM, but instead of the 3200 MHz it is rated for, I run it at 2933 MHz because that is the maximum for the board.

The system spec for the processor is 2933, but the board does up to 4133 OC, so you should be able to set your RAM to 3200 if that is what you have.

You are right, 2933 is the max for the CPU, not for the board.
But no, the PC does not boot when the RAM runs faster than 2933. The CPU is the limit here.

The log shows that lightdm.service fails because of the Xorg crash. lightdm saves a copy of the Xorg log to /var/log/lightdm. This is what it says:

X.Org X Server 1.20.10
X Protocol Version 11, Revision 0
Build Operating System: Linux Arch Linux
Current Operating System: Linux unten 5.10.21-1-lts #1 SMP Sun, 07 Mar 2021 11:56:15 +0000 x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux-lts root=UUID=b7131027-6b37-4189-ab20-f956e4c3024e rw iommu=off quiet loglevel=3 nowatchdog
Build Date: 14 December 2020  12:10:29PM
 
Current version of pixman: 0.40.0
        Before reporting problems, check http://wiki.x.org
        to make sure that you have the latest version.
Markers: (--) probed, (**) from config file, (==) default setting,
        (++) from command line, (!!) notice, (II) informational,
        (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.0.log", Time: Tue Mar  9 11:36:27 2021
(==) Using config directory: "/etc/X11/xorg.conf.d"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
Xorg: ../xserver/dix/privates.c:384: dixRegisterPrivateKey: Assertion `!global_keys[type].created' failed.
(EE) 
(EE) Backtrace:
(EE) 0: /usr/lib/Xorg (xorg_backtrace+0x53) [0x5576c2860f63]
(EE) 1: /usr/lib/Xorg (0x5576c271a000+0x151da5) [0x5576c286bda5]
(EE) 2: /usr/lib/libc.so.6 (0x7f2034225000+0x3cf80) [0x7f2034261f80]
(EE) 3: /usr/lib/libc.so.6 (gsignal+0x145) [0x7f2034261ef5]
(EE) 4: /usr/lib/libc.so.6 (abort+0x116) [0x7f203424b862]
(EE) 5: /usr/lib/libc.so.6 (0x7f2034225000+0x26747) [0x7f203424b747]
(EE) 6: /usr/lib/libc.so.6 (0x7f2034225000+0x35646) [0x7f203425a646]
(EE) 7: /usr/lib/Xorg (dixRegisterPrivateKey+0x0) [0x5576c27af870]
(EE) 8: /usr/lib/xorg/modules/libglamoregl.so (glamor_init+0xc9) [0x7f1ffc5e7fb9]
(EE) 9: /usr/lib/xorg/modules/drivers/modesetting_drv.so (0x7f203350e000+0x140fd) [0x7f20335220fd]
(EE) 10: /usr/lib/Xorg (AddGPUScreen+0x10e) [0x5576c279340e]
(EE) 11: /usr/lib/Xorg (0x5576c271a000+0x185eb9) [0x5576c289feb9]
(EE) 12: /usr/lib/Xorg (0x5576c271a000+0x1bb288) [0x5576c28d5288]
(EE) 13: /usr/lib/Xorg (0x5576c271a000+0x1bb52b) [0x5576c28d552b]
(EE) 14: /usr/lib/Xorg (InitInput+0xf5) [0x5576c2885435]
(EE) 15: /usr/lib/Xorg (0x5576c271a000+0x39798) [0x5576c2753798]
(EE) 16: /usr/lib/libc.so.6 (__libc_start_main+0xd5) [0x7f203424cb25]
(EE) 17: /usr/lib/Xorg (_start+0x2e) [0x5576c27545de]
(EE) 
(EE) 
Fatal server error:
(EE) Caught signal 6 (Aborted). Server aborting
(EE) 
(EE) 
Please consult the The X.Org Foundation support 
         at http://wiki.x.org
 for help.

This seems to be the culprit:

Xorg: ../xserver/dix/privates.c:384: dixRegisterPrivateKey: Assertion !global_keys[type].created failed.

After that crash, lightdm is restarted several times until it succeeds:

lightdm.service: Scheduled restart job, restart counter is at 3.

For me (I made a post about this very issue in January) it was a combination of NVIDIA drivers and the 5.10 kernel.

And since you run the 5.10 kernel instead of the latest, and the LTS kernel is now also 5.10…

Do you run Nvidia drivers? In that case either upgrade to 5.11 or remove the proprietary Nvidia drivers.

I have found an interesting fact:

When X crashes the log files show that X loaded the following modules:

[     5.858] (II) LoadModule: "glx"
[     5.859] (II) LoadModule: "ati"
[     5.859] (II) LoadModule: "radeon"
[     5.859] (II) LoadModule: "modesetting"
[     5.859] (II) LoadModule: "fbdev"
[     5.859] (II) LoadModule: "vesa"
[     5.891] (II) LoadModule: "fbdevhw"
[     5.891] (II) LoadModule: "fb"
[     5.892] (II) LoadModule: "shadow"
[     6.251] (II) LoadModule: "libinput"
[     7.023] (II) LoadModule: "modesetting"
[     7.040] (II) LoadModule: "glamoregl"
[     7.125] (II) LoadModule: "fb"

When X is not crashing I see the following modules in the log:

[     7.919] (II) LoadModule: "glx"
[     7.921] (II) LoadModule: "amdgpu"
[     7.926] (II) LoadModule: "ati"
[     7.931] (II) LoadModule: "modesetting"
[     7.932] (II) LoadModule: "fbdev"
[     7.932] (II) LoadModule: "vesa"
[     7.942] (II) LoadModule: "fbdevhw"
[     7.943] (II) LoadModule: "fb"
[     7.943] (II) LoadModule: "dri2"
[     7.986] (II) LoadModule: "glamoregl"
[     7.998] (II) LoadModule: "ramdac"
[     8.122] (II) LoadModule: "libinput"

The main difference I see is radeon vs. amdgpu. It looks like the first X start uses radeon, that one fails, and then amdgpu is used.

I have now blacklisted the radeon module. Let's see …
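For reference, blacklisting on Arch-based systems is usually done with a modprobe.d snippet; a minimal sketch (the filename is arbitrary, only the .conf extension matters):

```
# /etc/modprobe.d/radeon-blacklist.conf
blacklist radeon
```

If the module is pulled in by the initramfs, the initramfs has to be regenerated afterwards (on Arch/EndeavourOS: mkinitcpio -P) for the blacklist to take effect at boot.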

I wonder what happens if you create this file:

/etc/X11/xorg.conf.d/20-amdgpu.conf

Section "Device"
     Identifier "AMD"
     Driver "amdgpu"
EndSection