Gentoo Forums
NVIDIA Optimus: can't get working
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Sat Jun 07, 2014 12:39 pm    Post subject: NVIDIA Optimus: can't get working Reply with quote

Hi,

I'm trying to configure the X server to run with two video cards:
integrated Intel + discrete NVIDIA (Optimus). For a long time the proprietary driver had no support for Optimus, and we were forced to use Bumblebee or similar utilities. It looks like some 3xx driver version introduced Optimus support, so we shouldn't need Bumblebee anymore, right?

Now I'm trying to configure this Optimus thing without Bumblebee, but all my attempts result in a black screen, with or without a cursor.

I have read the threads regarding NVIDIA Optimus on this forum and tried most of the configurations mentioned there. None worked for me. Please help me figure out what's wrong with my configuration.

Code:
$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF108M [GeForce GT 630M] (rev a1)


/etc/X11/xorg.conf
Code:

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Screen 1 "intel"
    Inactive "intel"
EndSection

Section "Module"
   Load  "glx"
EndSection

Section "Monitor"
   Identifier "Mon0"
   VendorName "unknown"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
   BusID "PCI:1:0:0"
    Option       "ModeDebug" "True"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
   Monitor "Mon0"
    # Uncomment this line if your computer has no display devices connected to
    # the NVIDIA GPU.  Leave it commented if you have display devices
    # connected to the NVIDIA GPU that you would like to use.
    #Option "UseDisplayDevice" "none"
   #Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
   BusID "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
   Monitor "Mon0"
EndSection


xrandr
Code:

$ xrandr
Screen 0: minimum 8 x 8, current 1600 x 900, maximum 32767 x 32767
LVDS1 connected primary 1600x900+0+0 (normal left inverted right x axis y axis) 382mm x 215mm
   1600x900       60.0*+   40.0 
   1024x768       60.0 
   800x600        60.3     56.2 
   640x480        59.9 
VGA1 disconnected (normal left inverted right x axis y axis)
HDMI1 disconnected (normal left inverted right x axis y axis)
DP1 disconnected (normal left inverted right x axis y axis)
VIRTUAL1 disconnected (normal left inverted right x axis y axis)

$ xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x48 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 4 outputs: 5 associated providers: 0 name:Intel

(NVIDIA-0 provider is missing)

~/.xinitrc
Code:

xrandr --setprovideroutputsource Intel NVIDIA-0
xrandr --auto


Xorg.0.log

Quote:

[ 4441.451]
This is a pre-release version of the X server from The X.Org Foundation.
It is not supported in any way.
Bugs may be filed in the bugzilla at http://bugs.freedesktop.org/.
Select the "xorg" product for bugs you find in this release.
Before reporting bugs in pre-release versions please check the
latest version in the X.Org Foundation git repository.
See http://wiki.x.org/wiki/GitPage for git access instructions.
[ 4441.451]
X.Org X Server 1.15.99.903 (1.16.0 RC 3)
Release Date: 2014-06-04
[ 4441.451] X Protocol Version 11, Revision 0
[ 4441.451] Build Operating System: Linux 3.14.5-gentoo-r1 x86_64 Gentoo
[ 4441.451] Current Operating System: Linux pavilion 3.14.5-gentoo-r1 #5 SMP Sat Jun 7 14:16:51 NOVT 2014 x86_64
[ 4441.451] Kernel command line: BOOT_IMAGE=/kernel-3.14.5-gentoo root=/dev/sda13 ro lockd.nlm_udpport=32768 lockd.nlm_tcpport=32768 drm_kms_helper.edid_firmware=edid/1920x1080_clevo_W670SR.bin
[ 4441.451] Build Date: 07 June 2014 03:33:26PM
[ 4441.451]
[ 4441.451] Current version of pixman: 0.32.4
[ 4441.451] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 4441.451] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 4441.451] (==) Log file: "/var/log/Xorg.0.log", Time: Sat Jun 7 17:37:15 2014
[ 4441.451] (==) Using config file: "/etc/X11/xorg.conf"
[ 4441.451] (==) Using config directory: "/etc/X11/xorg.conf.d"
[ 4441.451] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 4441.452] (==) ServerLayout "layout"
[ 4441.452] (**) |-->Screen "nvidia" (0)
[ 4441.452] (**) | |-->Monitor "Mon0"
[ 4441.452] (**) | |-->Device "nvidia"
[ 4441.452] (**) |-->Inactive Device "intel"
[ 4441.452] (==) Automatically adding devices
[ 4441.452] (==) Automatically enabling devices
[ 4441.452] (==) Automatically adding GPU devices
[ 4441.452] (WW) The directory "/usr/share/fonts/TTF/" does not exist.
[ 4441.452] Entry deleted from font path.
[ 4441.452] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
[ 4441.452] Entry deleted from font path.
[ 4441.452] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
[ 4441.452] Entry deleted from font path.
[ 4441.452] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
[ 4441.452] Entry deleted from font path.
[ 4441.452] (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
[ 4441.452] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
[ 4441.452] Entry deleted from font path.
[ 4441.452] (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
[ 4441.452] (==) FontPath set to:
/usr/share/fonts/misc/
[ 4441.452] (==) ModulePath set to "/usr/lib64/xorg/modules"
[ 4441.452] (II) The server relies on udev to provide the list of input devices.
If no devices become available, reconfigure udev or disable AutoAddDevices.
[ 4441.452] (II) Loader magic: 0x803c60
[ 4441.452] (II) Module ABI versions:
[ 4441.452] X.Org ANSI C Emulation: 0.4
[ 4441.452] X.Org Video Driver: 18.0
[ 4441.452] X.Org XInput driver : 21.0
[ 4441.452] X.Org Server Extension : 8.0
[ 4441.452] (II) xfree86: Adding drm device (/dev/dri/card1)
[ 4441.452] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 4441.453] (--) PCI:*(0:0:2:0) 8086:0166:103c:181d rev 9, Mem @ 0xd4000000/4194304, 0xc0000000/268435456, I/O @ 0x00005000/64
[ 4441.453] (--) PCI: (0:1:0:0) 10de:0de9:103c:181d rev 161, Mem @ 0xd2000000/16777216, 0xa0000000/268435456, 0xb0000000/33554432, I/O @ 0x00004000/128, BIOS @ 0x????????/524288
[ 4441.453] (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
[ 4441.453] (II) LoadModule: "glx"
[ 4441.453] (II) Loading /usr/lib64/xorg/modules/extensions/libglx.so
[ 4441.460] (II) Module glx: vendor="NVIDIA Corporation"
[ 4441.460] compiled for 4.0.2, module version = 1.0.0
[ 4441.460] Module class: X.Org Server Extension
[ 4441.460] (II) NVIDIA GLX Module 337.25 Tue May 27 12:21:38 PDT 2014
[ 4441.461] (II) LoadModule: "nvidia"
[ 4441.461] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
[ 4441.461] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 4441.461] compiled for 4.0.2, module version = 1.0.0
[ 4441.461] Module class: X.Org Video Driver
[ 4441.461] (II) LoadModule: "modesetting"
[ 4441.461] (II) Loading /usr/lib64/xorg/modules/drivers/modesetting_drv.so
[ 4441.461] (II) Module modesetting: vendor="X.Org Foundation"
[ 4441.461] compiled for 1.15.99.903, module version = 0.8.1
[ 4441.461] Module class: X.Org Video Driver
[ 4441.461] ABI class: X.Org Video Driver, version 18.0
[ 4441.461] (II) NVIDIA dlloader X Driver 337.25 Tue May 27 12:01:55 PDT 2014
[ 4441.461] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 4441.461] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[ 4441.461] (++) using VT number 7

[ 4441.464] (II) Loading sub module "fb"
[ 4441.464] (II) LoadModule: "fb"
[ 4441.464] (II) Loading /usr/lib64/xorg/modules/libfb.so
[ 4441.464] (II) Module fb: vendor="X.Org Foundation"
[ 4441.464] compiled for 1.15.99.903, module version = 1.0.0
[ 4441.464] ABI class: X.Org ANSI C Emulation, version 0.4
[ 4441.464] (WW) Unresolved symbol: fbGetGCPrivateKey
[ 4441.464] (II) Loading sub module "wfb"
[ 4441.464] (II) LoadModule: "wfb"
[ 4441.464] (II) Loading /usr/lib64/xorg/modules/libwfb.so
[ 4441.464] (II) Module wfb: vendor="X.Org Foundation"
[ 4441.464] compiled for 1.15.99.903, module version = 1.0.0
[ 4441.464] ABI class: X.Org ANSI C Emulation, version 0.4
[ 4441.464] (II) Loading sub module "ramdac"
[ 4441.464] (II) LoadModule: "ramdac"
[ 4441.464] (II) Module "ramdac" already built-in
[ 4441.464] (II) modesetting(1): using drv /dev/dri/card0
[ 4441.464] (II) modesetting(G0): using drv /dev/dri/card0
[ 4441.464] (EE) Screen 1 deleted because of no matching config section.
[ 4441.464] (II) UnloadModule: "modesetting"
[ 4441.464] (II) NVIDIA(0): Creating default Display subsection in Screen section
"nvidia" for depth/fbbpp 24/32
[ 4441.464] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
[ 4441.464] (==) NVIDIA(0): RGB weight 888
[ 4441.464] (==) NVIDIA(0): Default visual is TrueColor
[ 4441.464] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[ 4441.464] (**) NVIDIA(0): Option "ModeDebug" "True"
[ 4441.465] (**) NVIDIA(0): Enabling 2D acceleration
[ 4441.581] (WW) NVIDIA(0): Unable to read EDID for display device CRT-0
[ 4441.582] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20130102)
[ 4441.582] (II) NVIDIA(0): NVIDIA GPU GeForce GT 630M (GF108) at PCI:1:0:0 (GPU-0)
[ 4441.582] (--) NVIDIA(0): Memory: 2097152 kBytes
[ 4441.582] (--) NVIDIA(0): VideoBIOS: 70.08.a8.00.4f
[ 4441.582] (II) NVIDIA(0): Detected PCI Express Link width: 16X
[ 4441.584] (--) NVIDIA(0): Valid display device(s) on GeForce GT 630M at PCI:1:0:0
[ 4441.584] (--) NVIDIA(0): CRT-0 (boot, connected)
[ 4441.584] (--) NVIDIA(0): CRT-0 Name Aliases:
[ 4441.584] (--) NVIDIA(0): CRT
[ 4441.584] (--) NVIDIA(0): CRT-0
[ 4441.584] (--) NVIDIA(0): DPY-0
[ 4441.584] (--) NVIDIA(0): VGA-0
[ 4441.584] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
[ 4441.584] (--) NVIDIA(GPU-0):
[ 4441.584] (--) NVIDIA(GPU-0): --- EDID for CRT-0 ---
[ 4441.584] (--) NVIDIA(GPU-0):
[ 4441.584] (--) NVIDIA(GPU-0): No EDID Available.
[ 4441.584] (--) NVIDIA(GPU-0):
[ 4441.584] (--) NVIDIA(GPU-0): --- End of EDID for CRT-0 ---
[ 4441.584] (--) NVIDIA(GPU-0):
[ 4441.584] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
[ 4441.584] (**) NVIDIA(0): device CRT-0 (Using EDID frequencies has been enabled on
[ 4441.584] (**) NVIDIA(0): all display devices.)
[ 4441.584] (II) NVIDIA(0): Frequency information for CRT-0:
[ 4441.584] (II) NVIDIA(0): HorizSync : 28.000-55.000 kHz
[ 4441.584] (II) NVIDIA(0): VertRefresh : 43.000-72.000 Hz
[ 4441.584] (II) NVIDIA(0): (HorizSync from Conservative Defaults)
[ 4441.584] (II) NVIDIA(0): (VertRefresh from Conservative Defaults)
[ 4441.584] (II) NVIDIA(GPU-0):
[ 4441.584] (II) NVIDIA(GPU-0): --- Building ModePool for CRT-0 ---

...

[ 4441.592] (II) NVIDIA(GPU-0): --- Done building ModePool for CRT-0 ---
[ 4441.592] (II) NVIDIA(GPU-0):
[ 4441.592] (II) NVIDIA(GPU-0):
[ 4441.592] (II) NVIDIA(GPU-0): --- Modes in ModePool for CRT-0 ---
[ 4441.592] (II) NVIDIA(GPU-0): "nvidia-auto-select" : 1024 x 768 @ 60.0 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "1024x768" : 1024 x 768 @ 60.0 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "1024x768_60" : 1024 x 768 @ 60.0 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "800x600" : 800 x 600 @ 72.2 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "800x600_72" : 800 x 600 @ 72.2 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "800x600_60" : 800 x 600 @ 60.3 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "800x600_56" : 800 x 600 @ 56.2 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "640x480" : 640 x 480 @ 59.9 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "640x480_60" : 640 x 480 @ 59.9 Hz (from: X Server, VESA)
[ 4441.592] (II) NVIDIA(GPU-0): "512x384" : 512 x 384 @ 60.0 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): "512x384d60" : 512 x 384 @ 60.0 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): "400x300" : 400 x 300 @ 72.2 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): "400x300d72" : 400 x 300 @ 72.2 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): "320x240" : 320 x 240 @ 60.1 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): "320x240d60" : 320 x 240 @ 60.1 Hz DoubleScan (from: X Server)
[ 4441.592] (II) NVIDIA(GPU-0): --- End of ModePool for CRT-0: ---
[ 4441.592] (II) NVIDIA(GPU-0):
[ 4441.592] (==) NVIDIA(0):
[ 4441.592] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[ 4441.592] (==) NVIDIA(0): will be used as the requested mode.
[ 4441.592] (==) NVIDIA(0):
[ 4441.592] (II) NVIDIA(0): Validated MetaModes:
[ 4441.592] (II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 800x600"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 800x600_60"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 800x600_56"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 640x480"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 512x384"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 400x300"
[ 4441.592] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: 320x240"
[ 4441.593] (II) NVIDIA(0): Adding implicit MetaMode: "VGA-0: nvidia-auto-select { ViewPortIn = 1024 x 576, ViewPortOut = 1024 x 576 + 0 + 96 }"
[ 4441.594] (WW) NVIDIA(0): Unable to get display device CRT-0's EDID; cannot compute DPI
[ 4441.594] (WW) NVIDIA(0): from CRT-0's EDID.
[ 4441.594] (==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
[ 4441.594] (==) modesetting(G0): Depth 24, (==) framebuffer bpp 32
[ 4441.594] (==) modesetting(G0): RGB weight 888
[ 4441.594] (==) modesetting(G0): Default visual is TrueColor
[ 4441.594] (II) modesetting(G0): ShadowFB: preferred YES, enabled YES
[ 4441.595] (II) modesetting(G0): Output LVDS-1-0 using monitor section Mon0
[ 4441.596] (II) modesetting(G0): Output VGA-1-0 has no monitor section
[ 4441.597] (II) modesetting(G0): Output HDMI-1-0 has no monitor section
[ 4441.597] (II) modesetting(G0): Output DisplayPort-1-0 has no monitor section
[ 4441.597] (II) modesetting(G0): EDID for output LVDS-1-0
[ 4441.597] (II) modesetting(G0): Manufacturer: LGD Model: 27a Serial#: 0
[ 4441.597] (II) modesetting(G0): Year: 2011 Week: 0
[ 4441.597] (II) modesetting(G0): EDID Version: 1.4
[ 4441.597] (II) modesetting(G0): Digital Display Input
[ 4441.597] (II) modesetting(G0): 6 bits per channel
[ 4441.597] (II) modesetting(G0): Digital interface is undefined
[ 4441.597] (II) modesetting(G0): Max Image Size [cm]: horiz.: 38 vert.: 21
[ 4441.597] (II) modesetting(G0): Gamma: 2.20
[ 4441.597] (II) modesetting(G0): No DPMS capabilities specified
[ 4441.597] (II) modesetting(G0): Supported color encodings: RGB 4:4:4 YCrCb 4:4:4
[ 4441.597] (II) modesetting(G0): First detailed timing is preferred mode
[ 4441.597] (II) modesetting(G0): Preferred mode is native pixel format and refresh rate
[ 4441.597] (II) modesetting(G0): redX: 0.616 redY: 0.346 greenX: 0.315 greenY: 0.602
[ 4441.597] (II) modesetting(G0): blueX: 0.152 blueY: 0.110 whiteX: 0.313 whiteY: 0.329
[ 4441.597] (II) modesetting(G0): Manufacturer's mask: 0
[ 4441.597] (II) modesetting(G0): Supported detailed timing:
[ 4441.597] (II) modesetting(G0): clock: 107.8 MHz Image Size: 382 x 215 mm
[ 4441.597] (II) modesetting(G0): h_active: 1600 h_sync: 1648 h_sync_end 1680 h_blank_end 1920 h_border: 0
[ 4441.597] (II) modesetting(G0): v_active: 900 v_sync: 903 v_sync_end 908 v_blanking: 936 v_border: 0
[ 4441.597] (II) modesetting(G0): Supported detailed timing:
[ 4441.597] (II) modesetting(G0): clock: 71.9 MHz Image Size: 382 x 215 mm
[ 4441.597] (II) modesetting(G0): h_active: 1600 h_sync: 1648 h_sync_end 1680 h_blank_end 1920 h_border: 0
[ 4441.597] (II) modesetting(G0): v_active: 900 v_sync: 903 v_sync_end 908 v_blanking: 936 v_border: 0
[ 4441.597] (II) modesetting(G0): Unknown vendor-specific block 2
[ 4441.597] (II) modesetting(G0): EDID (in hex):
[ 4441.597] (II) modesetting(G0): 00ffffffffffff0030e47a0200000000
[ 4441.597] (II) modesetting(G0): 00150104902615780aec159d58509a27
[ 4441.597] (II) modesetting(G0): 1c505400000001010101010101010101
[ 4441.597] (II) modesetting(G0): 0101010101011c2a4040618424303020
[ 4441.597] (II) modesetting(G0): 35007ed710000019131c404061842430
[ 4441.597] (II) modesetting(G0): 302035007ed710000019000000000000
[ 4441.597] (II) modesetting(G0): 00000000000000000000000000000002
[ 4441.597] (II) modesetting(G0): 000c47ff0a3c6426233f640000000061
[ 4441.597] (II) modesetting(G0): Printing probed modes for output LVDS-1-0
[ 4441.597] (II) modesetting(G0): Modeline "1600x900"x60.0 107.80 1600 1648 1680 1920 900 903 908 936 -hsync -vsync (56.1 kHz eP)
[ 4441.597] (II) modesetting(G0): Modeline "1600x900"x40.0 71.87 1600 1648 1680 1920 900 903 908 936 -hsync -vsync (37.4 kHz e)
[ 4441.597] (II) modesetting(G0): Modeline "1024x768"x120.1 133.47 1024 1100 1212 1400 768 768 770 794 doublescan -hsync +vsync (95.3 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "1024x768"x60.0 65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync (48.4 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "960x720"x120.0 117.00 960 1024 1128 1300 720 720 722 750 doublescan -hsync +vsync (90.0 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "928x696"x120.1 109.15 928 976 1088 1264 696 696 698 719 doublescan -hsync +vsync (86.4 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "896x672"x120.0 102.40 896 960 1060 1224 672 672 674 697 doublescan -hsync +vsync (83.7 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "800x600"x120.0 81.00 800 832 928 1080 600 600 602 625 doublescan +hsync +vsync (75.0 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "800x600"x60.3 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync (37.9 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "800x600"x56.2 36.00 800 824 896 1024 600 601 603 625 +hsync +vsync (35.2 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "700x525"x120.0 61.00 700 744 820 940 525 526 532 541 doublescan +hsync +vsync (64.9 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "640x512"x120.0 54.00 640 664 720 844 512 512 514 533 doublescan +hsync +vsync (64.0 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "640x480"x120.0 54.00 640 688 744 900 480 480 482 500 doublescan +hsync +vsync (60.0 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "640x480"x59.9 25.18 640 656 752 800 480 490 492 525 -hsync -vsync (31.5 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "512x384"x120.0 32.50 512 524 592 672 384 385 388 403 doublescan -hsync -vsync (48.4 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "400x300"x120.6 20.00 400 420 484 528 300 300 302 314 doublescan +hsync +vsync (37.9 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "400x300"x112.7 18.00 400 412 448 512 300 300 301 312 doublescan +hsync +vsync (35.2 kHz d)
[ 4441.597] (II) modesetting(G0): Modeline "320x240"x120.1 12.59 320 328 376 400 240 245 246 262 doublescan -hsync -vsync (31.5 kHz d)
[ 4441.598] (II) modesetting(G0): EDID for output VGA-1-0
[ 4441.599] (II) modesetting(G0): EDID for output HDMI-1-0
[ 4441.599] (II) modesetting(G0): EDID for output DisplayPort-1-0
[ 4441.599] (II) modesetting(G0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
[ 4441.599] (==) modesetting(G0): DPI set to (96, 96)
[ 4441.599] (II) Loading sub module "fb"
[ 4441.599] (II) LoadModule: "fb"
[ 4441.599] (II) Loading /usr/lib64/xorg/modules/libfb.so
[ 4441.599] (II) Module fb: vendor="X.Org Foundation"
[ 4441.599] compiled for 1.15.99.903, module version = 1.0.0
[ 4441.599] ABI class: X.Org ANSI C Emulation, version 0.4
[ 4441.599] (II) Loading sub module "shadow"
[ 4441.599] (II) LoadModule: "shadow"
[ 4441.599] (II) Loading /usr/lib64/xorg/modules/libshadow.so
[ 4441.599] (II) Module shadow: vendor="X.Org Foundation"
[ 4441.599] compiled for 1.15.99.903, module version = 1.1.0
[ 4441.599] ABI class: X.Org ANSI C Emulation, version 0.4
[ 4441.599] (--) Depth 24 pixmap format is 32 bpp
[ 4441.599] (==) modesetting(G0): Backing store enabled
[ 4441.599] (==) modesetting(G0): Silken mouse enabled
[ 4441.599] (II) modesetting(G0): RandR 1.2 enabled, ignore the following RandR disabled message.
[ 4441.600] (==) modesetting(G0): DPMS enabled
[ 4442.018] (II) NVIDIA: Using 3072.00 MB of virtual memory for indirect memory
[ 4442.018] (II) NVIDIA: access.
[ 4442.028] (II) NVIDIA(0): Screen transformation disabled for CRT-0
[ 4442.028] (II) NVIDIA(0): Setting mode "CRT-0:nvidia-auto-select"
[ 4442.115] (==) NVIDIA(0): Disabling shared memory pixmaps
[ 4442.115] (==) NVIDIA(0): Backing store enabled
[ 4442.115] (==) NVIDIA(0): Silken mouse enabled
[ 4442.115] (==) NVIDIA(0): DPMS enabled
[ 4442.115] (II) Loading sub module "dri2"
[ 4442.115] (II) LoadModule: "dri2"
[ 4442.115] (II) Module "dri2" already built-in
[ 4442.116] (II) NVIDIA(0): [DRI2] Setup complete
[ 4442.116] (II) NVIDIA(0): [DRI2] VDPAU driver: nvidia
[ 4442.116] (--) RandR disabled
[ 4442.119] (II) Initializing extension GLX
[ 4442.120] (II) modesetting(G0): Damage tracking initialized
[ 4442.141] (II) config/udev: Adding input device Power Button (/dev/input/event1)
[ 4442.141] (**) Power Button: Applying InputClass "evdev keyboard catchall"
[ 4442.141] (II) LoadModule: "evdev"
[ 4442.141] (II) Loading /usr/lib64/xorg/modules/input/evdev_drv.so
[ 4442.141] (II) Module evdev: vendor="X.Org Foundation"
[ 4442.141] compiled for 1.15.99.903, module version = 2.8.4
[ 4442.141] Module class: X.Org XInput Driver
[ 4442.141] ABI class: X.Org XInput driver, version 21.0
[ 4442.141] (II) Using input driver 'evdev' for 'Power Button'
[ 4442.141] (**) Power Button: always reports core events
[ 4442.141] (**) evdev: Power Button: Device: "/dev/input/event1"
[ 4442.141] (--) evdev: Power Button: Vendor 0 Product 0x1
[ 4442.141] (--) evdev: Power Button: Found keys
[ 4442.141] (II) evdev: Power Button: Configuring as keyboard
[ 4442.141] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/LNXPWRBN:00/input/input1/event1"
[ 4442.141] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD, id 6)
[ 4442.141] (**) Option "xkb_rules" "evdev"
[ 4442.141] (**) Option "xkb_model" "pc104"
[ 4442.141] (**) Option "xkb_layout" "us"
[ 4442.155] (II) config/udev: Adding input device Video Bus (/dev/input/event3)
[ 4442.155] (**) Video Bus: Applying InputClass "evdev keyboard catchall"
[ 4442.155] (II) Using input driver 'evdev' for 'Video Bus'
[ 4442.155] (**) Video Bus: always reports core events
[ 4442.155] (**) evdev: Video Bus: Device: "/dev/input/event3"
[ 4442.155] (--) evdev: Video Bus: Vendor 0 Product 0x6
[ 4442.155] (--) evdev: Video Bus: Found keys
[ 4442.155] (II) evdev: Video Bus: Configuring as keyboard
[ 4442.155] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/device:00/PNP0A08:00/LNXVIDEO:01/input/input3/event3"
[ 4442.155] (II) XINPUT: Adding extended input device "Video Bus" (type: KEYBOARD, id 7)
[ 4442.155] (**) Option "xkb_rules" "evdev"
[ 4442.155] (**) Option "xkb_model" "pc104"
[ 4442.155] (**) Option "xkb_layout" "us"
[ 4442.155] (II) config/udev: Adding input device Video Bus (/dev/input/event2)
[ 4442.155] (**) Video Bus: Applying InputClass "evdev keyboard catchall"
[ 4442.155] (II) Using input driver 'evdev' for 'Video Bus'
[ 4442.155] (**) Video Bus: always reports core events
[ 4442.155] (**) evdev: Video Bus: Device: "/dev/input/event2"
[ 4442.155] (--) evdev: Video Bus: Vendor 0 Product 0x6
[ 4442.155] (--) evdev: Video Bus: Found keys
[ 4442.155] (II) evdev: Video Bus: Configuring as keyboard
[ 4442.155] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/device:00/PNP0A08:00/device:36/LNXVIDEO:00/input/input2/event2"
[ 4442.155] (II) XINPUT: Adding extended input device "Video Bus" (type: KEYBOARD, id 8)
[ 4442.155] (**) Option "xkb_rules" "evdev"
[ 4442.155] (**) Option "xkb_model" "pc104"
[ 4442.155] (**) Option "xkb_layout" "us"
[ 4442.155] (II) config/udev: Adding input device Lid Switch (/dev/input/event0)
[ 4442.155] (II) No input driver specified, ignoring this device.
[ 4442.155] (II) This device may have been added with another device file.
[ 4442.156] (II) config/udev: Adding input device Logitech Unifying Device. Wireless PID:101a (/dev/input/event6)
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: Applying InputClass "evdev pointer catchall"
[ 4442.156] (II) Using input driver 'evdev' for 'Logitech Unifying Device. Wireless PID:101a'
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: always reports core events
[ 4442.156] (**) evdev: Logitech Unifying Device. Wireless PID:101a: Device: "/dev/input/event6"
[ 4442.156] (--) evdev: Logitech Unifying Device. Wireless PID:101a: Vendor 0x46d Product 0xc52b
[ 4442.156] (--) evdev: Logitech Unifying Device. Wireless PID:101a: Found 20 mouse buttons
[ 4442.156] (--) evdev: Logitech Unifying Device. Wireless PID:101a: Found scroll wheel(s)
[ 4442.156] (--) evdev: Logitech Unifying Device. Wireless PID:101a: Found relative axes
[ 4442.156] (--) evdev: Logitech Unifying Device. Wireless PID:101a: Found x and y relative axes
[ 4442.156] (II) evdev: Logitech Unifying Device. Wireless PID:101a: Configuring as mouse
[ 4442.156] (II) evdev: Logitech Unifying Device. Wireless PID:101a: Adding scrollwheel support
[ 4442.156] (**) evdev: Logitech Unifying Device. Wireless PID:101a: YAxisMapping: buttons 4 and 5
[ 4442.156] (**) evdev: Logitech Unifying Device. Wireless PID:101a: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
[ 4442.156] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.0/usb1/1-1/1-1.2/1-1.2:1.2/0003:046D:C52B.0003/0003:046D:C52B.0004/input/input8/event6"
[ 4442.156] (II) XINPUT: Adding extended input device "Logitech Unifying Device. Wireless PID:101a" (type: MOUSE, id 9)
[ 4442.156] (II) evdev: Logitech Unifying Device. Wireless PID:101a: initialized for relative axes.
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: (accel) keeping acceleration scheme 1
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: (accel) acceleration profile 0
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: (accel) acceleration factor: 2.000
[ 4442.156] (**) Logitech Unifying Device. Wireless PID:101a: (accel) acceleration threshold: 4
[ 4442.156] (II) config/udev: Adding input device Logitech Unifying Device. Wireless PID:101a (/dev/input/mouse0)
[ 4442.156] (II) No input driver specified, ignoring this device.
[ 4442.156] (II) This device may have been added with another device file.
[ 4442.156] (II) config/udev: Adding input device AT Translated Set 2 keyboard (/dev/input/event4)
[ 4442.156] (**) AT Translated Set 2 keyboard: Applying InputClass "evdev keyboard catchall"
[ 4442.156] (II) Using input driver 'evdev' for 'AT Translated Set 2 keyboard'
[ 4442.156] (**) AT Translated Set 2 keyboard: always reports core events
[ 4442.156] (**) evdev: AT Translated Set 2 keyboard: Device: "/dev/input/event4"
[ 4442.156] (--) evdev: AT Translated Set 2 keyboard: Vendor 0x1 Product 0x1
[ 4442.156] (--) evdev: AT Translated Set 2 keyboard: Found keys
[ 4442.156] (II) evdev: AT Translated Set 2 keyboard: Configuring as keyboard
[ 4442.156] (**) Option "config_info" "udev:/sys/devices/platform/i8042/serio0/input/input4/event4"
[ 4442.156] (II) XINPUT: Adding extended input device "AT Translated Set 2 keyboard" (type: KEYBOARD, id 10)
[ 4442.156] (**) Option "xkb_rules" "evdev"
[ 4442.156] (**) Option "xkb_model" "pc104"
[ 4442.156] (**) Option "xkb_layout" "us"
[ 4442.157] (II) config/udev: Adding input device SynPS/2 Synaptics TouchPad (/dev/input/event7)
[ 4442.157] (**) SynPS/2 Synaptics TouchPad: Applying InputClass "evdev touchpad catchall"
[ 4442.157] (**) SynPS/2 Synaptics TouchPad: Applying InputClass "touchpad catchall"
[ 4442.157] (**) SynPS/2 Synaptics TouchPad: Applying InputClass "Default clickpad buttons"
[ 4442.157] (II) LoadModule: "synaptics"
[ 4442.157] (II) Loading /usr/lib64/xorg/modules/input/synaptics_drv.so
[ 4442.157] (II) Module synaptics: vendor="X.Org Foundation"
[ 4442.157] compiled for 1.15.99.903, module version = 1.7.6
[ 4442.157] Module class: X.Org XInput Driver
[ 4442.157] ABI class: X.Org XInput driver, version 21.0
[ 4442.157] (II) Using input driver 'synaptics' for 'SynPS/2 Synaptics TouchPad'
[ 4442.157] (**) SynPS/2 Synaptics TouchPad: always reports core events
[ 4442.157] (**) Option "Device" "/dev/input/event7"
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: x-axis range 1472 - 5666 (res 43)
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: y-axis range 1408 - 4772 (res 72)
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: pressure range 0 - 255
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: finger width range 0 - 15
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: buttons: left right double triple
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: Vendor 0x2 Product 0x7
[ 4442.177] (--) synaptics: SynPS/2 Synaptics TouchPad: touchpad found
[ 4442.177] (**) SynPS/2 Synaptics TouchPad: always reports core events
[ 4442.190] (**) Option "config_info" "udev:/sys/devices/platform/i8042/serio1/input/input6/event7"
[ 4442.190] (II) XINPUT: Adding extended input device "SynPS/2 Synaptics TouchPad" (type: TOUCHPAD, id 11)
[ 4442.190] (**) synaptics: SynPS/2 Synaptics TouchPad: (accel) MinSpeed is now constant deceleration 2.5
[ 4442.190] (**) synaptics: SynPS/2 Synaptics TouchPad: (accel) MaxSpeed is now 1.75
[ 4442.190] (**) synaptics: SynPS/2 Synaptics TouchPad: (accel) AccelFactor is now 0.037
[ 4442.190] (**) SynPS/2 Synaptics TouchPad: (accel) keeping acceleration scheme 1
[ 4442.190] (**) SynPS/2 Synaptics TouchPad: (accel) acceleration profile 1
[ 4442.190] (**) SynPS/2 Synaptics TouchPad: (accel) acceleration factor: 2.000
[ 4442.190] (**) SynPS/2 Synaptics TouchPad: (accel) acceleration threshold: 4
[ 4442.190] (--) synaptics: SynPS/2 Synaptics TouchPad: touchpad found
[ 4442.190] (II) config/udev: Adding input device SynPS/2 Synaptics TouchPad (/dev/input/mouse1)
[ 4442.190] (**) SynPS/2 Synaptics TouchPad: Ignoring device from InputClass "touchpad ignore duplicates"
[ 4442.190] (II) config/udev: Adding input device ST LIS3LV02DL Accelerometer (/dev/input/event5)
[ 4442.190] (II) No input driver specified, ignoring this device.
[ 4442.190] (II) This device may have been added with another device file.
[ 4442.190] (II) config/udev: Adding input device HP WMI hotkeys (/dev/input/event8)
[ 4442.190] (**) HP WMI hotkeys: Applying InputClass "evdev keyboard catchall"
[ 4442.190] (II) Using input driver 'evdev' for 'HP WMI hotkeys'
[ 4442.190] (**) HP WMI hotkeys: always reports core events
[ 4442.190] (**) evdev: HP WMI hotkeys: Device: "/dev/input/event8"
[ 4442.190] (--) evdev: HP WMI hotkeys: Vendor 0 Product 0
[ 4442.190] (--) evdev: HP WMI hotkeys: Found keys
[ 4442.190] (II) evdev: HP WMI hotkeys: Configuring as keyboard
[ 4442.190] (**) Option "config_info" "udev:/sys/devices/virtual/input/input9/event8"
[ 4442.190] (II) XINPUT: Adding extended input device "HP WMI hotkeys" (type: KEYBOARD, id 12)
[ 4442.190] (**) Option "xkb_rules" "evdev"
[ 4442.190] (**) Option "xkb_model" "pc104"
[ 4442.190] (**) Option "xkb_layout" "us"


Of course, `glxgears` fails:
Code:

$ glxgears
Xlib:  extension "GLX" missing on display ":0".
Error: couldn't get an RGB, Double-buffered visual


Besides, I don't understand whether NVIDIA truly supports Optimus on Linux. I can't find a clear statement that the proprietary driver supports full-fledged Optimus on Linux without extra configuration or extra packages like Bumblebee.

Please help!
Jaglover
Watchman


Joined: 29 May 2005
Posts: 7093
Location: Saint Amant, Acadiana

PostPosted: Sat Jun 07, 2014 3:45 pm    Post subject: Reply with quote

Did you see /usr/share/doc/nvidia-drivers-337.25/README.bz2, chapter 18.
_________________
Please learn how to denote units correctly!
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Sat Jun 07, 2014 4:22 pm    Post subject: Reply with quote

Jaglover wrote:
Did you see /usr/share/doc/nvidia-drivers-337.25/README.bz2, chapter 18.


Yes, I read it. In my understanding the chapter says that only some lucky laptop owners can take full advantage of the Optimus technology: namely, people whose laptops have a hardware multiplexer (which can connect the NVIDIA GPU to the laptop display panel), because they can switch between the video cards manually. Well, I'm not so lucky, because I don't have such a "mux".

Then I see the following:
Quote:
On muxless Optimus laptops, or on laptops where a mux is present, but not set to drive the internal display from the NVIDIA GPU, the internal display is driven by the integrated GPU. On these systems, it's important that the X server not be configured to use the NVIDIA X driver after the driver is installed. Instead, the correct driver for the integrated GPU should be used.


I don't understand what the "internal display" is. Maybe some virtual display which can be used by CUDA apps, something that is not visible to me; so I skip this sentence. Now I'm trying to configure each GPU to use its own driver (intel with intel, nvidia with nvidia):
*/etc/portage/make.conf*
Code:
VIDEO_CARDS="intel nvidia"

and rebuild related packages:
Code:
nvidia-drivers, @x11-module-rebuild, $(eix -# -I emul | xargs)

(I may be wrong, I don't know what to do exactly!)
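
In shell terms that was something like the following (exact commands from memory; treat this as a sketch rather than the canonical way):
Code:

emerge --ask --changed-use --deep @world                    # pick up the new VIDEO_CARDS flags
emerge --ask --oneshot x11-drivers/nvidia-drivers @x11-module-rebuild
emerge --ask --oneshot $(eix -# -I emul)                    # rebuild the emul-linux libs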

Next lines of the docs:
Quote:
Often, this can be determined automatically by the X server, and no explicit
configuration is required, especially on newer X server versions. If your X
server autoselects the NVIDIA X driver after installation, you may need to
explicitly select the driver for your integrated GPU.

Well, the no-configuration approach didn't work for me, so I proceeded with a custom X server configuration, i.e. the /etc/X11/xorg.conf posted above.
The X server doesn't auto-select the NVIDIA X driver (and that is the actual issue); it selects "intel". In xorg.conf I try to select the "nvidia" driver for the discrete card explicitly.

The next lines are clearer. They refer to Chapter 33, which describes how to modify /etc/X11/xorg.conf. So I followed these instructions and modified xorg.conf
and .xinitrc. However, the commands mentioned in that chapter didn't work for me:
Quote:

Code:
$ xrandr --setprovideroutputsource modesetting NVIDIA-0
$ xrandr --auto



because the only xrandr provider I have is called "Intel":
Code:

 xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x48 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 4 outputs: 5 associated providers: 0 name:Intel


Now the next sentence comes in very "handy":
Quote:
If either provider is missing
or doesn't have the expected capability, check your system configuration.


Thus, I'm here on the Gentoo forum asking for help.

Thanks.
Dr.Willy
Guru


Joined: 15 Jul 2007
Posts: 492
Location: NRW, Germany

PostPosted: Sat Jun 07, 2014 11:48 pm    Post subject: Reply with quote

r.osmanov wrote:
*/etc/portage/make.conf*
Code:
VIDEO_CARDS="intel nvidia"

Mhh on my Optimus setup, I have
Code:
VIDEO_CARDS="intel modesetting nvidia"
dachschaden
n00b


Joined: 07 Jun 2014
Posts: 65
Location: Germany

PostPosted: Sun Jun 08, 2014 5:44 pm    Post subject: Reply with quote

@Dr. Willy: For what earthly reason did you enable the modesetting driver? If my memory serves right, the intel driver provides KMS (kernel mode setting), so you don't have to rely on user mode setting.

@r.osmanov: Without bumblebee? Well, I don't want to act like an ultra 1337 hacker, but I tried for several months to teach my Gentoo to use the nVidia card as an off-loader. In the end I ended up with bumblebee. :)

It was a 635M, nearly the same model as yours. The problem is, as you already stated, that the mux is missing, so the internal panel is driven by the Intel chip. The "battleship" (as I like to call the dedicated card) is not directly connected to the monitor, so even if you provided the EDID (x11-misc/read-edid is your friend here; it ships a tool named get-edid that talks to your display via DDC to do what Xorg normally does, i.e. fetch the EDID), the card still would not know how to "speak" to your display.
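
That is, roughly this, run as root (from memory):
Code:

get-edid > /tmp/edid.bin          # raw EDID blob
get-edid | parse-edid             # human-readable dump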

Bumblebee works around that problem by creating a secondary X server that runs alongside your first X server (the one rendered by your Intel chip). The second X server is able to use the nVidia chip, and since your Intel chip is driving the display, the nVidia chip's output gets a path to your display and everything renders just fine.
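
For reference, the Gentoo-side setup is roughly the following (from memory, so check the wiki for the exact steps; "youruser" is a placeholder):
Code:

emerge --ask x11-misc/bumblebee        # the bbswitch USE flag gives you power management
gpasswd -a youruser bumblebee          # your user must be in the bumblebee group
rc-update add bumblebeed default
rc-service bumblebeed start
optirun glxgears -info                 # quick test run on the nVidia card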

In my opinion it's not worth the trouble, though. I once tried to run a PS1 emulator in Gentoo using bumblebee. Not only did the CPU usage occupy a whole core (while in Windows it was just 50% - there is something kaput with rendering in general on Linux, I never saw it running as smoothly as on Windows), but the server also seemed to crash each time I wanted to open the menu to load/save a state or reconfigure settings - apparently some problem with non-3D rendering. Apart from that, I was not able to tell the card to use vsync, so it would render 300 frames per second while 60 would have been sufficient. I had spent three weeks on it before I gave up. ;)
(And don't get me started on the broken ALSA plugin, which forced me to rely on the OSS plugin, which in turn had a bug and accessed the wrong audio device file, so I had to patch the plugin's source and recompile it. And of course the makefile wasn't suited for 64-bit operating systems ...)

I know it's hard to be told that you should just "give up for now" - hell, I am persistent as hell and wouldn't listen to anyone who told me to give up. But I'd have saved some time if I had.
The Linux, Nouveau and Wayland guys are currently working on fixing the broken graphics stack on Linux, but all that takes time. Unless you are a kernel hacker, I'd suggest you just relax and let them do their work until things get better.
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Mon Jun 09, 2014 5:43 pm    Post subject: Reply with quote

dachschaden, thanks for sharing your experience. I've almost given up myself. I will likely be forced to use Bumblebee again.

I made some progress with the following /etc/X11/xorg.conf.
Code:

Section "ServerFlags"
    Option   "AutoAddDevices" "true"
    Option   "AllowEmptyInput" "no"
EndSection

Section "Monitor"
   Identifier     "Monitor0"
   VendorName     "Unknown"
   ModelName      "Unknown"
   HorizSync       28.0 - 73.0
   VertRefresh     43.0 - 72.0
   Option         "DPMS"
   #Modeline       "1600x1200"  161.00  1600 1712 1880 2160  1200 1203 1207 1245 -hsync +vsync
EndSection

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID          "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Monitor        "Monitor0"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID          "PCI:0:2:0"
    VendorName     "onboard"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
    Monitor        "Monitor0"
EndSection


It still shows a black screen. However, xrandr now recognizes an NVIDIA-0 provider on display :0:

Code:
$ xrandr --listproviders -d :0
Providers: number : 2
Provider 0: id: 0x2ac cap: 0x1, Source Output crtcs: 2 outputs: 1 associated providers: 0 name:NVIDIA-0
Provider 1: id: 0x46 cap: 0x2, Sink Output crtcs: 3 outputs: 4 associated providers: 0 name:modesetting

(launched in separate VT)

Now it looks like I need to set up xrandr with the couple of commands mentioned in Chapter 33 (/usr/share/doc/nvidia-drivers-337.25/README.bz2). At this point I try to run the commands from the same VT:

Code:
$ xrandr --setprovideroutputsource modesetting NVIDIA-0 -d :0
XIO:  fatal IO error 11 (Resource temporarily unavailable) on X server ":0"
      after 17 requests (17 known processed) with 0 events remaining.


Segmentation fault. Apparently this is some kind of bug in the tool; I don't feel like hacking the code right now. My guess is that something goes wrong because the command is run from a different session.

Well, README.bz2 suggests putting these commands into ~/.xinitrc:
Code:

exec ck-launch-session dbus-launch mate-session
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto

As might have been guessed, the commands below the exec ... mate-session line are never executed. That's what I've got so far. Let me ask a stupid question: how do I fix it? :)
dachschaden
n00b


Joined: 07 Jun 2014
Posts: 65
Location: Germany

PostPosted: Mon Jun 09, 2014 9:29 pm    Post subject: Reply with quote

I'd love to fiddle with my nVidia card and maybe get it running without bumblebee, I really would.
The problem is that my current card is not even supported by the nouveau driver - 3.15.0 should have fixed that, since I am using a Maxwell chip, but:

1. I'd need a firmware blob for this card first, and they do not ship it yet.
2. DRI does not seem to work, even the slightest access to /dev/dri/ will cause my kernel to hang.
And I don't really want to switch to the nvidia driver because of the mess with the OpenGL libs. There's eselect, but that didn't work last time, at least not for me. :(
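For the record, the switch itself is just the following; whether it actually takes effect cleanly is another matter:
Code:

eselect opengl list
eselect opengl set xorg-x11    # or "nvidia"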

So it's a bit problematic for me to try out the config you provided. I no longer have the same hardware as back then.

BUT: what puzzles me is that line about modesetting. I do not remember specifying anything like that - neither in the naive attempt to let the X server just handle the device, nor when I used bumblebee.

r.osmanov wrote:
As might have been guessed, the commands below the exec ... mate-session line are never executed. That's what I've got so far. Let me ask a stupid question: how do I fix it? :)


Adding a ' &' at the end of the first line? Unfortunately I cannot test it here, but it would be my very first approach.
Dr.Willy
Guru


Joined: 15 Jul 2007
Posts: 492
Location: NRW, Germany

PostPosted: Mon Jun 09, 2014 10:38 pm    Post subject: Reply with quote

dachschaden wrote:
@Dr. Willy: For what earthly reason did you enable the modesetting driver? If my memory serves right, the intel driver provides KMS (kernel mode setting), so you don't have to rely on user mode setting.

For the reason that it's not working without it.

At least for me X won't start without the modesetting driver installed due to the following X section.
Code:
Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID          "PCI:0:2:0"
    VendorName     "onboard"
EndSection

Setting the driver to "intel" doesn't work either.

Also the README says:
Quote:
To use the NVIDIA driver as an RandR 1.4 output source provider, the X server needs to be configured to use the NVIDIA driver for its primary screen and to use the "modesetting" driver for the other graphics device.


r.osmanov wrote:
Well, README.bz2 suggests putting these commands into ~/.xinitrc:
Code:

exec ck-launch-session dbus-launch mate-session
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto

As might have been guessed, the commands below the exec ... mate-session line are never executed. That's what I've got so far. Let me ask a stupid question: how do I fix it? :)

… put the xrandr commands before the exec ... mate-session line?
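
Everything after the exec line is never reached, since exec replaces the shell with the session process. So something like this (untested, adjust the session command to whatever you use):
Code:

xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
exec ck-launch-session dbus-launch mate-session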
dachschaden
n00b


Joined: 07 Jun 2014
Posts: 65
Location: Germany

PostPosted: Mon Jun 09, 2014 11:56 pm    Post subject: Reply with quote

OK, I never had to assign that driver ... I might do another test on my box with the modesetting driver. Thanks for the suggestion, though.

If my memory serves right, xrandr only works against a running server. If you run it from a VT, for example, it will say that it can't open the display. That's why r.osmanov uses the -d :0 argument:

Code:

xrandr --listproviders -d :0


The X server, which cannot deliver the rendered scene to the display, is still being run with the nVidia driver as the "provider".
To get that information, he has to switch to a VT and then specify the display whose providers he wants to list.
Yamakuzure
Advocate


Joined: 21 Jun 2006
Posts: 2273
Location: Bardowick, Germany

PostPosted: Wed Jun 11, 2014 10:56 am    Post subject: Reply with quote

Hi, I have a laptop with a muxless hybrid, Intel HD plus an Nvidia K2100M:
Code:
 ~ $ sudo lspci | grep -i vga
00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
01:00.0 VGA compatible controller: NVIDIA Corporation GK106GLM [Quadro K2100M] (rev ff)
The laptop is configured to use Intel only:
Code:
 ~ $ sudo eselect opengl list
Available OpenGL implementations:
  [1]   nvidia
  [2]   xorg-x11 *
Then I have installed bumblebee with primus offloader:
Code:
 ~ $ eix bumblebee
[I] x11-misc/bumblebee
     Available versions:  3.2.1 (**)9999[1] {+bbswitch VIDEO_CARDS="nouveau nvidia"}
     Installed versions:  9999[1](13:32:49 30.05.2014)(bbswitch VIDEO_CARDS="nvidia -nouveau")
     Homepage:            http://bumblebee-project.org https://github.com/Bumblebee-Project/Bumblebee
     Description:         Service providing elegant and stable means of managing Optimus graphics chipsets

[1] "bumblebee" /var/lib/layman/bumblebee
 ~ $ eix -I primus
[I] x11-misc/primus [1]
     Available versions:  (**)9999 {ABI_MIPS="n32 n64 o32" ABI_X86="32 64 x32"}
     Installed versions:  9999(13:34:48 30.05.2014)(ABI_MIPS="-n32 -n64 -o32" ABI_X86="64 -32 -x32")
     Homepage:            https://github.com/amonakov/primus
     Description:         Faster OpenGL offloading for Bumblebee

[1] "bumblebee" /var/lib/layman/bumblebee
And here is the result:
Code:
# With intel card:
 ~ $ glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Haswell Mobile
225.416832 frames/sec - 251.565184 Mpixels/sec
191.208057 frames/sec - 213.388192 Mpixels/sec
190.810194 frames/sec - 212.944177 Mpixels/sec

# With nvidia using optirun
 ~ $ optirun glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Quadro K2100M/PCIe/SSE2
366.242147 frames/sec - 408.726236 Mpixels/sec
372.048190 frames/sec - 415.205780 Mpixels/sec
367.461145 frames/sec - 410.086638 Mpixels/sec

# With nvidia using primusrun:
 ~ $ primusrun glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Quadro K2100M/PCIe/SSE2
463.456412 frames/sec - 517.217356 Mpixels/sec
481.095675 frames/sec - 536.902773 Mpixels/sec
481.906130 frames/sec - 537.807242 Mpixels/sec
I never got the hybrid system to work in any other way. xrandr always shows only one provider, no matter what.

It seems there is no way the nvidia driver can do this on its own. And if you look into the NVIDIA system settings under Windows on a hybrid laptop (I have a Windows 7 dual boot via EFI), it does the same: use the integrated chipset and activate the dedicated graphics card for chosen applications only.

The ideal solution would be to have nvidia drivers on Linux that can be configured the same way: automagically use the integrated chipset and only activate the nvidia card where configured to do so. (Switching automatically on any OpenGL use is useless if the window manager itself uses OpenGL; that would result in a permanently powered-on card, although the intel chip is perfectly capable of handling that.)

So until then, there at least *is* bumblebee/primus. ;-)
_________________
Important German:
  1. "Aha" - German reaction to pretend that you are really interested while giving no f*ck.
  2. "Tja" - German reaction to the apocalypse, nuclear war, an alien invasion or no bread in the house.
Princess Nell
l33t


Joined: 15 Apr 2005
Posts: 751

PostPosted: Wed Jun 11, 2014 11:16 pm    Post subject: Reply with quote

@r.osmanov:

How do you start X? If you use a display manager, it needs to run those xrandr commands you currently have in .xinitrc. E.g. I use lightdm with mate, and I have set it up with a helper script through the display-setup-script variable. Other DMs are listed at http://wiki.gentoo.org/wiki/NVIDIA_Driver_with_Optimus_Laptops. If you use startx, however, the .xinitrc method should work.
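
Roughly like this, for example (the file names are just my example):
Code:

# /etc/lightdm/lightdm.conf
[SeatDefaults]
display-setup-script=/etc/lightdm/display-setup.sh

# /etc/lightdm/display-setup.sh (make it executable)
#!/bin/sh
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto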

My xorg.conf is not a million miles away from yours. There are no Monitor and ServerFlags sections, and accordingly the nvidia screen has no Monitor line, but I do have the "UseDisplayDevice" "none" option. The intel screen, consequently, also has no Monitor line. Xorg is version 1.15.0.
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Thu Jun 12, 2014 11:24 am    Post subject: Reply with quote

Princess Nell wrote:
@r.osmanov:

How do you start X? If you use a display manager, it needs to run those xrandr commands you currently have in .xinitrc. E.g. I use lightdm with mate, and I have set it up with a helper script through the display-setup-script variable. Other DMs are listed at http://wiki.gentoo.org/wiki/NVIDIA_Driver_with_Optimus_Laptops. If you use startx, however, the .xinitrc method should work.


I've managed to configure rendering via NVIDIA card (/usr/share/doc/nvidia-drivers-334.21-r3/README.bz2, Chapter 33) with LightDM + Openbox:

/etc/lightdm/lightdm.conf
Code:
[LightDM]
session-wrapper=/etc/lightdm/Xsession

[SeatDefaults]
session-wrapper=/etc/lightdm/Xsession

[Seat:0]
xserver-config        = /etc/X11/xorg.conf.optimus
xserver-command       = X -seat 0
xserver-share         = True
display-setup-script  = /opt/lightdm/bin/xrandr-optimus



/opt/lightdm/bin/xrandr-optimus
Code:

#!/bin/sh
# Optimus (for /etc/lightdm/lightdm.conf and /etc/X11/xorg.conf.optimus)
# note: redirect stdout to the log first, then stderr, or stderr ends up elsewhere
xrandr --setprovideroutputsource modesetting NVIDIA-0 >> /tmp/optimus.log 2>&1
xrandr --auto >> /tmp/optimus.log 2>&1


/etc/X11/xorg.conf.optimus
Code:

Section "Module"
   # Disable        "dri"
   # Disable        "fb"
EndSection

Section "ServerFlags"
    Option   "AutoAddDevices" "true"
    Option   "AllowEmptyInput" "no"
EndSection

Section "Monitor"
   Identifier     "Monitor0"
   VendorName     "Unknown"
   ModelName      "Unknown"
   HorizSync       28.0 - 73.0
   VertRefresh     43.0 - 72.0
   Option         "DPMS"
   Modeline      "1600x900"    107.80  1600 1648 1680 1920  900 903 908 936 -hsync -vsync
EndSection

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID          "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Monitor        "Monitor0"
    Option "AllowEmptyInitialConfiguration"

   DefaultDepth    24
    Option         "metamodes" "1600x900 +0+0; nvidia-auto-select +0+0"
   SubSection     "Display"
      Depth       24
      Modes      "1600x900" "1280x1024"
   EndSubSection
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    #Driver "intel"
    BusID          "PCI:0:2:0"
    VendorName     "onboard"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
    Monitor        "Monitor0"

   DefaultDepth    24
    Option         "metamodes" "1600x900 +0+0"
   SubSection     "Display"
      Depth       24
      Modes      "1600x900"
EndSubSection
EndSection


There is no ~/.xinitrc, and ~/.xprofile has nothing to do with xrandr/optimus/WM.

I haven't tried testing it with bumblebee running yet.

So, it works with Openbox, but shows a black screen in an Xfce session, although the X server commands are alike:

Openbox
Code:

root      5815  9.5  0.5 160644 46952 tty8     Ss+  17:05   0:00 /usr/bin/X -seat 0 :1 -config /etc/X11/xorg.conf.optimus -seat seat0 -auth /var/run/lightdm/root/:1 -nolisten tcp vt8 -novtswitch
ruslan    5841  3.4  0.4 326192 33012 ?        Ss   17:05   0:00 /usr/bin/openbox --startup /usr/libexec/openbox-autostart OPENBOX
root      6063  0.0  0.0 112580   960 tty1     S+   17:05   0:00 grep --colour=auto X


Xfce
Code:

root      5392  6.1  0.6 164524 51188 tty8     Ss+  17:04   0:00 /usr/bin/X -seat 0 :1 -config /etc/X11/xorg.conf.optimus -seat seat0 -auth /var/run/lightdm/root/:1 -nolisten tcp vt8 -novtswitch
ruslan    5416  0.0  0.0 115344  1588 ?        Ss   17:04   0:00 /bin/sh /etc/xdg/xfce4/xinitrc -- /etc/X11/xinit/xserverrc
root      5689  0.0  0.0 112580   956 tty1     S+   17:04   0:00 grep --colour=auto X


So I've almost got it :roll:. It would be perfectly fine for me to have different Openbox/Xfce sessions for Intel and for NVIDIA.

Btw, an analogous configuration worked when I launched X via startx.
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Sun Jun 15, 2014 5:00 am    Post subject: Reply with quote

Finally, I've got it. Two VTs: vt7 on Intel, vt8 on NVIDIA card as off-loader.
However, it all works terribly slowly, making only about 50 FPS (glxgears,
Enemy Territory), so I'll stick with the good old Bumblebee and give
Primus a try (can't see any performance gains so far).

The following hopefully will save someone's time a little.

/etc/lightdm/lightdm.conf
Code:

[LightDM]
greeter-user=lightdm
user-authority-in-system-dir=false
session-wrapper=/etc/lightdm/Xsession

[SeatDefaults]
session-wrapper=/etc/lightdm/Xsession

[Seat:0]
xserver-layout          = INTEL_MODESETTING
xserver-command         = X -seat seat0
xserver-share           = false

[Seat:1]
xserver-layout          = NVIDIA
xserver-command         = X -seat seat1
xserver-share           = false
display-setup-script    = /opt/lightdm/bin/xrandr-optimus
autologin-user          = ruslan


/etc/X11/xorg.conf.d/00-layouts.conf
Code:

Section "ServerLayout"
  Identifier "INTEL"
  Screen 0 "intel"
EndSection

Section "ServerLayout"
  Identifier "INTEL_MODESETTING"
  Screen 0 "intel_modesetting"
  #Inactive "nvidia"
  #Option "Clone" "off"
EndSection

Section "ServerLayout"
  Identifier "NVIDIA"
  Screen 0 "nvidia"
  Inactive "intel_modesetting"
  #Inactive "intel"
  #Option "Clone" "off"
EndSection


/etc/X11/xorg.conf.d/01-server-flags.conf
Code:

Section "ServerFlags"
  # only pertains to mousedrv and vmmouse and has no effect on evdev or
  # others. It is probably not needed, but will not hurt us here
  Option  "AutoAddDevices" "false"
  # lets us control which seat gets which input devices
  Option  "AutoEnableDevices" "false"
  # prevents killing X with a Ctrl+Alt+Backspace event
  #Option "DontZap" "false"
  Option  "AllowEmptyInput" "no"
  Option  "DefaultServerLayout" "INTEL"
  Option "Xinerama" "off"
  # stop the X server adding non-primary devices as GPU screens
  Option "AutoAddGPU" "off"
  Option "ProbeAllGpus" "false"
  Option "DRI2" "on"
EndSection


/etc/X11/xorg.conf.d/02-monitor.conf
Code:

Section "Monitor"
  Identifier "Monitor0"
  VendorName "Unknown"
  ModelName "Unknown"
  HorizSync 28.0 - 73.0
  VertRefresh 43.0 - 72.0
  Option "DPMS"
  Modeline "1600x900" 107.80 1600 1648 1680 1920 900 903 908 936 -hsync -vsync
EndSection


/etc/X11/xorg.conf.d/03-devices.conf
Code:

Section "Device"
  Identifier "nvidia"
  Driver "nvidia"
  BusID "PCI:1:0:0"
  Option "RegistryDwords" "EnableBrightnessControl=1"
EndSection

Section "Device"
  Identifier "intel"
  Driver "intel"
  BusID "PCI:0:2:0"
  VendorName "onboard"
  Option "AccelMethod" "sna"
  Option "Backlight" "intel_backlight" # use your backlight that works here
EndSection

Section "Device"
  Identifier "intel_modesetting"
  Driver "modesetting"
  #Driver "intel"
  BusID "PCI:0:2:0"
  VendorName "onboard"
  Option "AccelMethod" "sna"
  Option "Backlight" "intel_backlight" # use your backlight that works here
EndSection


/etc/X11/xorg.conf.d/04-screens.conf
Code:

Section "Screen"
  Identifier "nvidia"
  Device "nvidia"
  Monitor "Monitor0"
  Option "AllowEmptyInitialConfiguration"
  DefaultDepth 24
  Option "metamodes" "1600x900 +0+0; nvidia-auto-select +0+0"
  SubSection     "Display"
    Depth 24
    Modes "1600x900"
  EndSubSection
EndSection

Section "Screen"
  Identifier "intel"
  Device "intel"
  Monitor "Monitor0"
  DefaultDepth 24
  Option "metamodes" "1600x900 +0+0"
  SubSection "Display"
    Depth 24
    Modes "1600x900"
  EndSubSection
EndSection

Section "Screen"
  Identifier "intel_modesetting"
  Device "intel_modesetting"
  Monitor "Monitor0"
  DefaultDepth 24
  Option "metamodes" "1600x900 +0+0"
  SubSection "Display"
    Depth 24
    Modes "1600x900"
  EndSubSection
EndSection


(the xorg.conf is split into modular snippets under /etc/X11/xorg.conf.d/)

Thank you all for replies.
SDNick484
Apprentice


Joined: 05 Dec 2005
Posts: 212

PostPosted: Mon Nov 24, 2014 9:27 am    Post subject: Reply with quote

r.osmanov wrote:
Finally, I've got it. Two VTs: vt7 on Intel, vt8 on NVIDIA card as off-loader.
However, it all works terribly slowly, making only about 50 FPS (glxgears,
Enemy Territory), so I'll stick with the good old Bumblebee and give
Primus a try (can't see any performance gains so far).

The following hopefully will save someone's time a little.
...


I'm curious if you stuck with these settings or have made any updates? I'm trying to get primusrun to work, but am running into: "primus: fatal: failed to acquire direct rendering context for display thread"
r.osmanov
n00b


Joined: 21 Oct 2011
Posts: 28

PostPosted: Mon Nov 24, 2014 10:23 am    Post subject: Reply with quote

SDNick484 wrote:
r.osmanov wrote:
Finally, I've got it. Two VTs: vt7 on Intel, vt8 on NVIDIA card as off-loader.
However, it all works terribly slowly, making only about 50 FPS (glxgears,
Enemy Territory), so I'll stick with the good old Bumblebee and give
Primus a try (can't see any performance gains so far).

The following hopefully will save someone's time a little.
...


I'm curious if you stuck with these settings or have made any updates? I'm trying to get primusrun to work, but am running into: "primus: fatal: failed to acquire direct rendering context for display thread"


I gave up configuring primusrun. It is not working:
Code:

Xlib:  extension "GLX" missing on display ":0.0".
primus: fatal: broken GLX on main X display


However, optirun is fine. At least I'm satisfied with it.