I just got past this part today. I was just waiting and pressing F when the "!" indicator showed up, and it went fine.

It's all in the timing: you need to get it just right or he will just carry on. YouTube videos are handy for working out how to get past the battle.
 
I can't remember that fight being too difficult... but maybe that's because I use my Xbox controller to play the game. I like keyboard and mouse for most of my PC gaming... but this style of game plays better with the controller... and with the sound effects in this game, the rumble of the controller adds to the experience.

Anyhow... the first thing I upgrade is all my brawl upgrades. For anything big or hard to shoot... I keep dodging until it tells you to strike. On the Xbox controller that's the 'Y' button.

Speaking of adding to the experience... I bought these last night. I may have to replay Tomb Raider in 3D... :D
 

[Attachment: geforce-3d-vision2-glasses-kit-low-3qtr.png]
It is probably a good idea to play this with a controller. Fights are too easy for me with mouse and keyboard, even on hard difficulty. I usually only die on QTEs or because of overconfidence = bringing a bow to a gunfight :)
 
Just finished the game (again). Worked splendidly :)
Hats off to the gents and ladies at Feral for making this possible!
(Apart from the motion blur issue I mentioned a few posts back, it worked swell!)
 
Just to give some detail (to those interested), here is a simplistic explanation of Core vs Legacy (aka Compatibility) GL.

CoreGL allows access to OpenGL 3.0 through 4.4.
LegacyGL (aka Compatibility) supports 1.1 to 4.4, meaning it supports all the older stuff as well as the new. However, no OpenGL 4 features are currently exposed on Legacy (though in theory they could be in the future).

(Apple have fully implemented all features up to 4.1 and some of 4.2.)

In Compatibility all your old code will still run and you also get access to new features. However, this extended support comes at a cost.

In Core, all the older, deprecated stuff does not exist. The CoreGL profile is based on brand new code written from scratch, which means it is usually faster, as it does not have any of the legacy baggage to slow it down; it's also where Apple put most of their development effort.

It takes some work to move over to the new CoreGL profile, as you need to get everything working and remove all legacy GL calls. However, it does mean you get all the benefits of the newer, more optimised implementation.

However, you cannot always use the CoreGL profile, as in some cases the older code is faster, especially (it seems) on older or integrated cards. Tomb Raider uses both methods right now depending on your hardware, and I suspect a few Feral games will offer both options, but over time games will move to CoreGL only, especially once the older cards are no longer supported.
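To make the Core/Legacy distinction concrete, here is a minimal sketch (not Feral's actual code) of the window hints an application would set before creating its context, using GLFW-style hint names; the version numbers follow the limits mentioned above, where Apple's implementation tops out around GL 4.1:

```python
def context_hints(profile: str) -> dict:
    """GLFW-style window hints for a Core vs. Legacy/Compatibility
    OpenGL context on the Mac (illustrative only)."""
    if profile == "core":
        # macOS only hands out Core contexts that are 3.2+ and
        # forward-compatible; 4.1 is the highest Apple implements.
        return {
            "GLFW_CONTEXT_VERSION_MAJOR": 4,
            "GLFW_CONTEXT_VERSION_MINOR": 1,
            "GLFW_OPENGL_PROFILE": "GLFW_OPENGL_CORE_PROFILE",
            "GLFW_OPENGL_FORWARD_COMPAT": True,
        }
    # Asking for nothing special yields the Legacy/Compatibility
    # context (2.1 plus extensions on macOS), where all the old
    # fixed-function calls still work.
    return {
        "GLFW_CONTEXT_VERSION_MAJOR": 2,
        "GLFW_CONTEXT_VERSION_MINOR": 1,
        "GLFW_OPENGL_PROFILE": "GLFW_OPENGL_ANY_PROFILE",
    }
```

A real app would pass these hints to the windowing library before creating the window; a Core context then rejects every deprecated call, which is exactly why porting takes the work described above.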

Edwin
I tried the legacy OpenGL mode to see how it performed. I didn't expect such a difference: 28 fps instead of 42 in the benchmark, and the difference looks even bigger in game when there is a lot of detail.
I don't know if you guys spent less time optimizing for legacy mode or if the difference just represents the jump in performance of the new GL stack, but I'd like to see more games using the new OpenGL. :)
 

Some modern games use features that are just that much quicker in OpenGL 4. It's not always the case, but in Tomb Raider the geometry shaders give a really big boost, especially in the more complex areas. That said, the game still runs faster in Legacy mode on Intel hardware, as the Intel drivers are faster in Legacy right now (at least for this game).
 
@edddeduck: on my late 2013 iMac (780M + Core i5), at native res, the game runs at about 30fps fully maxed in the starting zone (the cave), and I was obviously disappointed considering indoors easily runs at 60fps on Windows.

Then I ticked "Use Legacy OpenGL", and voilà, 60fps in OS X too! Almost as good as Windows (not as good outdoors, but it's OK), day & night with legacy OpenGL.

Is it normal?

Please don't drop this feature, because without it TR would have been running terribly on my Mac!
 
Is it normal?

Sounds like your Mac has a bug, as it runs faster here on similar hardware. In some cases LegacyGL runs faster, especially if you have older drivers or hit a driver bug; this is why we left the option in the game for people with OS/driver problems.

Also, how can you tell it's 30fps? There is no frame rate counter in the game; last time I checked we get ~50+ on that card.

Edwin
 
How can I tell it's 30fps? I have eyes that can tell when a game runs at 30fps versus 50fps.

I have the PS4 version, which runs at 60fps, the PS3 version, which is locked at 30fps, and I have it on Windows too, so I know what 30fps vs 50fps looks like.

My Mac "has a bug"? What? Macs have "bugs"? It's not like I can install NVIDIA drivers or whatever, since they're all included in OS X already...

I'm on 10.9.1 BTW.

And that's at 2560x1440 (native res)... If I lower the res a little, it's smoother. Nevertheless, at 1440p it's much faster in Legacy OGL.


EDIT: and my eyes were right, I just ran the in-game benchmark:

Legacy OpenGL mode OFF: min FPS: 14.7, max FPS: 34.9, average: 30.9
Legacy OpenGL mode ON: min FPS: 42.4, max FPS: 64.6, average: 54.0

Settings: I deleted the preferences data, disabled vsync, and changed the resolution to 1440p. Left everything else at default.
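For scale, a quick back-of-the-envelope on the benchmark numbers above (just arithmetic, nothing official):

```python
# Benchmark results posted above: late-2013 iMac, 780M, 1440p
core = {"min": 14.7, "max": 34.9, "avg": 30.9}    # Legacy OFF (CoreGL)
legacy = {"min": 42.4, "max": 64.6, "avg": 54.0}  # Legacy ON

def speedup(a: float, b: float) -> float:
    """How many times faster b is than a."""
    return b / a

avg_gain = speedup(core["avg"], legacy["avg"])
min_gain = speedup(core["min"], legacy["min"])
print(f"average: {avg_gain:.2f}x, minimum: {min_gain:.2f}x")
# roughly 1.75x on average and nearly 2.9x on the minimum
```

The minimum-fps gap is the one you feel in play, which matches the "much faster in Legacy" impression in this post.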

So if my iMac has issues, please enlighten me so I can "fix" this. I don't think I can just call Apple and tell them "I think my iMac has issues because Tomb Raider runs really well with Legacy OpenGL but not without it", can I?

And again, my iMac is a late 2013 27" model with the Core i5 + 780M upgrade. Other games seem to run fine.
 

Core i7 and 780M, running the benchmark at native resolution with everything maxed...

Min FPS 13.5
Max FPS 31.2
Average FPS 28.0

Did you try the same benchmark on Windows?
 
So if my iMac has issues, please enlighten me so I can "fix" this. I don't think I can just call Apple and tell them "I think my iMac has issues because Tomb Raider runs really well with Legacy OpenGL but not without it", can I?

I don't know exactly what is causing the drop, but it ran faster here than it does on your Mac; it could be a number of factors. I'd wait for the next OS update, which should speed up your machine a little more.

And again, my iMac is a late 2013 27" model with the Core i5 + 780M upgrade. Other games seem to run fine.

Yes, you have a high-end iMac, and all games including Tomb Raider run fine on your iMac model.

Edwin
 
Just to weigh in: I tried the legacy GL setting and it made the same kind of difference on my 780M setup (2560x1440, settings slightly lower than High):

CoreGL: 24, 35, 31
LegacyGL: 41, 60, 53

Much better, although still nowhere near my Windows levels, where I can comfortably run slightly higher than Ultra.
 

FYI, the most demanding area in the game for performance testing is entering Shanty Town. There you will notice a massive boost using CoreGL (about double Legacy mode on most cards).

In some Mac/PC tests using an Nvidia 650M, the Mac was actually a few fps faster than the PC in these areas. However, in the simpler areas the Windows version can perform better; in general, the lower the load, the faster Windows runs. Overall, the Mac and PC should be in a similar ballpark.

Some settings can slow the Mac down more than the PC, as we don't have OpenGL 4.2-4.4, which would allow us to match DX11 feature for feature while maintaining performance.

Edwin

UPDATE: Did some testing to check what's going on with the NV 700 series. Testing on "High", with the next OS update you will gain about 10 to 12 fps in CoreGL, taking the average fps to ~45. LegacyGL is around ~60 in the fps test, but is slower when you enter Shanty Town compared to the CoreGL profile, which has a more stable fps in all areas of the game. The improvement is more noticeable on AMD cards.
 
Thanks for reinforcing my point. As I thought, my Mac isn't buggy, which is quite reassuring.

Let's wait for 10.9.2, I guess, which seems to be what TR was tested on...

 
Let's wait for 10.9.2, I guess, which seems to be what TR was tested on...

Tomb Raider was tested on 10.9 (and 10.8, although we dropped 10.8 for performance reasons), but there will be improvements in newer OS releases like 10.9.2 as Nvidia fix bugs in the 700 series drivers. Other series of cards will also benefit, but the latest cards usually have the most bugs and the biggest improvements.
 
Yes I did; it's much faster, and I sometimes get more than 100fps.

I still consider the OS X version playable with Legacy on, though.

 
Is "Tessellation" coming to the Mac? What does "High Precision" do?

Yes, we can enable Hardware tessellation once some driver bugs are fixed.

Quality: Determines the overall quality of the graphics. It’s a good baseline to know what to expect from each of the settings.
Texture Quality: Keep this one on “normal” unless you have a lot of video memory to spare. Medium performance hit.
Texture Filter: Determines the level of anisotropy (e.g. how clear the textures look). Unless you’re running a low-end graphics card, you shouldn’t have any problem bringing this all the way up. Low performance hit.
Hair Quality: Makes Lara’s hair look realistic. I recommend setting it to TressFX for the best experience—but only if you have an AMD card. It doesn’t work well for nVidia cards. Very high performance hit.
Anti-Aliasing: Smooths jagged edges. It'll give you a major performance hit unless you set it to FXAA or turn it off. Computers with high-end setups should experiment with 2xSSAA and 4xSSAA.
Shadows: Determines how many shadows are projected onto the scene. Leave this one on Normal.
Shadow Resolution: Defines the quality of the shadows. High performance hit.
Level of Detail: Determines how objects look from a distance. Unless you need to see high-quality leaves and flowers from miles away, leave this one on normal. Medium performance hit.
Reflections: Determines which objects cast reflections and at what quality. Medium performance hit.
Depth of Field: Gives close-up scenes a “bokeh” appearance in the background. Medium performance hit, but only during those occasions.
SSAO: This stands for “screen space ambient occlusion”. This is one of the biggest performance-affecting settings in the game. Adds global shadow and lighting effects to every scene and allows objects to cast realistic shadows on other objects in the scene. Goes all the way up to Ultra.
Post-Processing: Adds a bunch of motion blur. High performance hit.
Tessellation: Adds detail to models in the game. Currently problematic and related to crash issues, recommend setting it to ‘off’ for now.
High Precision: This poorly defined name describes a feature that adds darker shades and different intensities of light to the game. In short, it makes the game look more realistic. Minor performance hit. Keep it on.

http://www.gamefront.com/tomb-raider-2013-graphical-preset-comparisons-performance-tips/
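If you want to turn the guide above into an actual tuning order, here is a tiny sketch; the numeric weights are my own reading of the guide's low/medium/high/very-high labels, not anything official:

```python
# Performance hit per setting, transcribed from the guide above
# (1 = low, 2 = medium, 3 = high, 4 = very high).
HIT = {
    "Texture Filter": 1,
    "Texture Quality": 2,
    "Level of Detail": 2,
    "Reflections": 2,
    "Depth of Field": 2,
    "Shadow Resolution": 3,
    "Post-Processing": 3,
    "Anti-Aliasing": 3,   # major unless FXAA or off
    "Hair Quality": 4,    # TressFX
}

def tuning_order(hits: dict) -> list:
    """Settings to lower first when chasing frame rate: biggest hit first."""
    return sorted(hits, key=hits.get, reverse=True)

print(tuning_order(HIT)[:3])
# Hair Quality comes first, followed by the "high"-hit settings
```

So when fps drops, the guide's advice amounts to: drop Hair Quality/AA/shadow resolution first, and leave the low-cost filters alone.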
 
Hmm... no 'Hair Quality' or 'Tessellation' showing up for me.
 

Already answered both earlier in the thread :) Simply put, Hair Quality requires an API from AMD, and Tessellation requires some bug fixes in the OS.

TressFX requires AMD to implement the API on the Mac. The game can support it (and the code is ready if that happens), but the feature does not currently exist. It's the same as on the PS3 and 360; they don't have TressFX either, as the feature is missing from the platform.

If it is released in the future, turning it on should be a very simple process.

Edwin

 
Hi Edwin,

First of all, thank you for this wonderful port of Tomb Raider!

My new MacBook Pro Retina is capable of running this game at a pretty comfortable framerate (~40 fps average). But is there a way to lock the framerate to 30 fps like it is on consoles?

I do this with all games on the Mac when possible to make the FPS more consistent, but I haven't seen such an option in TR yet.
 
Edwin, is there any way to show FPS in the game itself, as opposed to the benchmark?

Most games don't come with an on-screen display to view FPS during gameplay. I use RivaTuner in conjunction with EVGA Precision X to display GPU stats, FPS, and screen capture during gameplay... I took this screen capture in TR of a guy who exploded himself into a wall... LOL... :D

Spoiler Image

My OSD GPU stats are in the top left corner... and FPS counter is on the bottom row.

I'm not sure about Macs... but maybe there is a FRAPS-like utility you can use that displays FPS during gameplay, like RivaTuner/Precision X.
 