Native resolution on an LCD (or LED) panel gives the best possible image quality and performance. You will not see an improvement running at a higher resolution: if the monitor accepts it at all, it has to scale the image back down to the native resolution. Running a lower resolution has a similar effect, although the performance boost may outweigh the latency added by the scaling (we are talking milliseconds). Some setups actually run much smoother at the native resolution than they do at lower resolutions, even though in theory they should run quicker; that is not unusual, but it is not common either. Just for sh!ts and giggles, here is some simple math...
| resolution  | pixels    |
| ----------- | --------- |
| 1024 x 768  | 786,432   |
| 1280 x 1024 | 1,310,720 |
| 1680 x 1050 | 1,764,000 |
| 1920 x 1080 | 2,073,600 |
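The numbers above are just width times height. A quick sketch in plain Python (the baseline comparison against 1024 x 768 is my own addition, not from the table):

```python
# Pixel counts for the common resolutions listed above.
resolutions = [(1024, 768), (1280, 1024), (1680, 1050), (1920, 1080)]

baseline = 1024 * 768  # using the lowest resolution as the reference point

for width, height in resolutions:
    pixels = width * height
    factor = pixels / baseline  # how much more work per frame vs. the baseline
    print(f"{width} x {height}: {pixels:,} pixels ({factor:.2f}x the baseline)")
```

Note that 1920 x 1080 is over 2.6 times as many pixels as 1024 x 768, and every one of them has to be recalculated every frame.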
As you can see, as the resolution goes up, the number of pixels whose color must be calculated goes up as well, and dramatically. Higher resolutions usually go hand in hand with a higher level of detail, meaning the GPU must draw more polygons. High-quality effects, special lighting, and complex shadows take away resources too by adding to the number of calculations for each frame (image). Now factor in the CPU: the position of every model, object, etc. is calculated for each frame, and a higher resolution means a higher number of coordinates for edges and so on (you get the picture). Higher resolutions = more complex math.
It's difficult to find a good balance when neither the video card nor the CPU is able to run full settings.