Just as an experiment, I took a screenshot of TSC running on the Feather River Canyon Enhanced Route. I captured it with my phone's camera at 48 megapixels. My monitor is a Viewsonic 27/28 gaming monitor, set to 1900x1020p with a refresh rate of 180Hz, and I also use RW Enhancer 2. Having a higher refresh rate makes your game's graphics look sharper. I had to upload the shot to Google Drive as it is too big for this forum. Here is the link to view it: https://drive.google.com/file/d/1NBsTFY923388boX-TXEWhQTa0UNlqL8f/view?usp=sharing
Can't dwell in the past, but after this weekend I regret going TSW first! TSC with some enhancements/mods looks pretty darn good. Example: the tunnels leaving Liverpool Lime Street!
Why did you photograph your monitor instead of taking a screenshot? Not sure I'd want TSC to look like TSW if I'm honest. I think TSW looks too shiny and processed. Graphically it's more sophisticated, but I don't think it looks as real.
Because a screenshot doesn't actually capture what you see on your screen. This article that I found on the internet may explain it:

Taking a screenshot is something that is "programmed". Different software systems handle a "screen shot" differently. By programmed, I mean that the developer has to write actual code that manages the data making up the final image. The one thing a screen shot is not: a literal "shot" of the data the monitor is displaying. It is typically a copy of the data that is about to be sent off to be displayed, such as the frame buffer or a copy of a swap image. The data comes from the GPU, not the display device. Any settings on the display device, such as brightness, are not captured, nor should they be, as that would make the image specific to a single user's device settings and make it look different on every device it is displayed on.

The goal of manufacturers is to set a display device up so that it displays data in a way that adheres to a specification. The image is then captured in a format that also adheres to a spec, such as sRGB. If this is done accurately, the captured image should look the same on any display device. The image is then saved in a format that records which specification it adheres to; that information is stored inside files such as JPEGs. However, taking an exact "grab" of the data sitting on the GPU will not necessarily produce good results, and most programs will modify it in some way, such as applying some anti-aliasing. It must also be reformatted to match the spec of an image format; an HDR image may need to be converted to SDR if it is going to be saved to a JPEG, for example. Screen-grab software, whether it is part of a game engine or a separate piece of software, often has settings that allow the image to be modified as well. The exact technical details differ between programs and even between graphics APIs; DirectX will be different from OpenGL.

Fundamentally, modern GPUs adhere to a specification, and the data coming out of the GPU is formatted to that spec. The two most popular specs are HDMI and DisplayPort. The display device sends EDID (Extended Display Identification Data) to the GPU as part of that protocol, which tells the GPU some limited information about the display device. The GPU "remembers" this information but may not set aside an exact one-to-one mapping of pixel data; there is a buffer, but don't lock your thinking into "there must be a one-to-one mapping there". The GPU is free to set aside any size chunk it pleases, and individual manufacturers may have different ideas about the correct amount of data to set aside. When it is time to transmit the data, as agreed by the protocol, that buffer is converted to the connection format such as HDMI. Once the data is converted and sent out over the wires, the GPU loses interest, and it is up to the display device to convert the HDMI data to a displayable format. This is where the settings on the monitor come into play: brightness, colour space, and all the other monitor settings are applied to the image that has been decoded from the HDMI format into whatever internal format the device needs to display the data.
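To illustrate the point (nothing specific to TSC here), a minimal sketch in Python using Pillow's ImageGrab shows what a basic screen grab actually does: it copies the pixels the OS has already composited on the GPU side and saves them to a standard PNG. The file name and the example resolution in the comment are just placeholders.

```python
# Minimal screen-grab sketch (Python + Pillow). Illustrative only; a game
# engine or capture tool does the same job through its own API and options.
from PIL import ImageGrab  # pip install pillow; works on Windows and macOS

# Copy the pixels the OS has composited for display. This is frame-buffer
# data from the GPU side, NOT a photo of the panel, so monitor brightness,
# contrast and colour settings have no effect on the result.
img = ImageGrab.grab()
print(img.size, img.mode)  # e.g. (1920, 1080) RGB

# Save in a format that carries a known spec (PNG, assumed sRGB here), so it
# should look the same on any correctly calibrated display.
img.save("tsc_screenshot.png")
```

That is the whole point made above: the monitor's own settings never make it into the saved file.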
Yes, but the only difference is the monitor. All you're showing by photographing the monitor is how your monitor is displaying the image, so unless there's something unique about the monitor settings, a screen grab would still have been better. Plus, photographing your monitor doesn't actually look anything like what you see with your eyes: contrast will be different, and your phone will process the image. Auto exposure will either over- or under-expose it depending on the screen's content at the time, and moiré patterns can occur as well.

Here's a photo of my monitor showing the same image as above. It looks nothing like what I can see with my eyes: far more contrast, colours that are totally different, and highlights burning out. I see none of that with my eyes.

I don't think you understand how cameras work. Refresh rates do not make your image sharper. Perceptually, moving images will appear to have less blur, but none of that will appear in a photograph of your screen, especially if you pause the game to take the photo; in that case you could have a refresh rate of 1Hz and it would make no difference. In fact, if you're not pausing the game to take the shot, it may well be sharper with a low refresh rate, depending on what shutter speed your camera is using. If you have a 180Hz refresh rate and your camera is using a 1/30th shutter speed, it will actually introduce more blur, as many frames will be displayed while the shutter is open. If you want your game to look sharper, I'd choose a resolution greater than 1900x1020p.
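The shutter-speed point is simple arithmetic; this tiny sketch just plugs in the 180Hz figure from the post above and an assumed 1/30s phone shutter speed:

```python
# How many monitor frames land inside a single camera exposure?
refresh_hz = 180      # monitor refresh rate quoted above
shutter_s = 1 / 30    # assumed phone shutter speed of 1/30th of a second

frames_in_exposure = refresh_hz * shutter_s
print(f"{frames_in_exposure:.0f} frames blended into one photo")  # -> 6
```

Any motion across those six frames gets averaged together in the photo, which is the extra blur described above.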
A higher refresh rate does make your images look sharper, even at 1920x1080p; see "What Is Refresh Rate and Why It Matters?". BTW, I run a Viewsonic FHD monitor: https://www.tradeinn.com/techinn/en...rgn6wGLC2d4S-fcbe7oQLa9W36rIAgzkaAjzgEALw_wcB Let's leave it at that; we can agree to disagree, as I have no time to argue with you. I have seen from many of your posts on here that you can be quite argumentative.
While I don't mind TSW graphics to a point, I prefer TSC in that it actually does look more real if you use graphics enhancements. I just use ReShade and tweak a few settings here and there; it might not be everyone's cup of tea, but it opens that can of worms wide open. This is also with AP weather as well as track mods and tweaking the in-game graphics settings.
Nope... technically, it's doing nothing whatsoever to the image; there are just more of them per second being displayed. I've already said that perceptually it reduces blur on fast-moving scenes, though. I guess you chose not to read the part of my post that says precisely the same thing as the article you linked to. However, blur caused by movement is an artifact of monitors with slow pixel response, and if your monitor is 180Hz then it won't have a slow pixel response (unless it's cheap). Using a lower refresh rate doesn't make the image less sharp; it just makes the motion less smooth. The images being displayed are identical regardless of refresh rate. However, if you want to argue with facts, you go straight ahead... I guess it's just par for the course these days.

[edit] No we don't. I don't agree to that at all. You're just wrong... factually, technically wrong.