
[JVM 7] JVM Knowledge Summary


 

1. JVM Run Modes

1.1 The JVM runs in either -server or -client mode; on 32-bit machines only the client VM is available. 64-bit JVMs default to server mode: the server VM starts up somewhat more slowly, but it optimizes the running code as aggressively as possible at run time.

1.2 The JVM offers three bytecode execution modes: mixed mode, interpreted mode, and compiled mode. (A small snippet after this item shows how to check the VM type and execution mode from inside a program.)
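
Both 1.1 and 1.2 can be checked from running code via standard system properties. This is a minimal sketch only; the class name ShowVmMode is made up for this example, and the exact property values depend on your JVM build:

public class ShowVmMode {
    public static void main(String[] args) {
        // On 64-bit HotSpot, java.vm.name typically contains "64-Bit Server VM".
        System.out.println("VM name: " + System.getProperty("java.vm.name"));
        // java.vm.info typically reports the execution mode, e.g. "mixed mode".
        System.out.println("VM info: " + System.getProperty("java.vm.info"));
    }
}

Running it with -Xint or -Xcomp should change the reported mode accordingly.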

 

The following command shows that the JVM runs in mixed mode. What exactly is mixed mode?

[hadoop@hadoop jvms]$ java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

 

Key information in this output:

1. 64-Bit Server: the 64-bit server VM is in use.

2. mixed mode: mixed mode of execution is the default mode of HotSpot and means that the JVM dynamically compiles bytecode into native code at run time.

 

2. Forcing the JVM to print its version information before running a program

[hadoop@hadoop jvms]$ java -showversion HelloJvm
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

This is HelloJvm
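
The HelloJvm class used in these transcripts is not shown in the original post; judging by its output it is presumably just a one-line hello program along these lines:

public class HelloJvm {
    public static void main(String[] args) {
        // Matches the output seen in the transcripts above.
        System.out.println("This is HelloJvm");
    }
}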

3. mixed mode / compiled mode / interpreted mode

 

  

[hadoop@hadoop jvms]$ java -showversion -Xmixed HelloJvm
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

This is HelloJvm

 

 

[hadoop@hadoop jvms]$ java -showversion -Xcomp HelloJvm
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, compiled mode)

This is HelloJvm

 

[hadoop@hadoop jvms]$ java -showversion -Xint HelloJvm
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, interpreted mode)

This is HelloJvm

 

 

Why not use the -Xcomp or -Xint modes?

The -Xint flag forces the JVM to execute all bytecode in interpreted mode, which comes with a considerable slowdown, usually a factor of 10 or higher. The flag -Xcomp forces exactly the opposite behavior: the JVM compiles all bytecode into native code on first use, applying the maximum optimization level. This sounds nice because it completely avoids the slow interpreter, yet many applications will also suffer at least a bit from -Xcomp, even if the drop in performance is not comparable with the one resulting from -Xint.

The reason is that -Xcomp prevents the JVM from using its JIT compiler to full effect. The JIT compiler builds method usage profiles at run time and then optimizes individual methods (or parts of them) step by step, and sometimes speculatively, for the actual application behavior. Some of these optimization techniques, e.g. optimistic branch prediction, cannot be applied effectively without first profiling the application. Another aspect is that methods only get compiled at all once they prove themselves relevant, i.e. constitute some kind of hot spot in the application. Methods that are called rarely (or even only once) continue to be executed in interpreted mode, saving the compilation and optimization cost.
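
To get a feel for the slowdown, you can time a CPU-bound method under the different modes. The following is a rough sketch rather than a rigorous benchmark; the class name InterpreterCost and the loop bound are arbitrary choices for illustration:

public class InterpreterCost {

    // A cheap, CPU-bound method; hot enough that the JIT compiles it in mixed mode.
    static long checksum(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        long result = checksum(500_000_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("result=" + result + ", elapsed=" + elapsedMs + " ms");
    }
}

Compile it once, then compare java InterpreterCost, java -Xint InterpreterCost, and java -Xcomp InterpreterCost; expect the -Xint run to be dramatically slower, with the exact factor depending on hardware and JVM version.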

 

2. What is JIT (just-in-time) compilation?

The input to the JVM is bytecode, and there are several choices for how it handles that bytecode during program execution. At times the JVM interprets bytecode only, without compiling it into native code first. In fact, every Java program you run will normally have some fraction of its bytecode interpreted during execution.

Early JVMs contained only an interpreter, so your whole Java program was interpreted. That is the main reason why Java was (rightly) considered slow in its early years. Today, modern JVMs still allow you to use interpreted-only mode by specifying -Xint on the command line. Add -XX:+PrintCompilation in addition to -Xint and you can see for yourself that no compilation of bytecode into native code happens. Compare the output with a run of the same Java program without -Xint.
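
A small program with an obviously hot method makes this easy to try; HotMethodDemo below is a made-up name used only for this sketch. Run it with -XX:+PrintCompilation and compilation lines for hotLoop should appear; add -Xint and they disappear:

public class HotMethodDemo {

    // Called many times from main, so HotSpot should eventually treat it as "hot".
    static int hotLoop(int n) {
        int hash = 0;
        for (int i = 0; i < n; i++) {
            hash = hash * 31 + i; // cheap work; int overflow is harmless here
        }
        return hash;
    }

    public static void main(String[] args) {
        int total = 0;
        for (int i = 0; i < 20_000; i++) {
            total += hotLoop(1_000);
        }
        System.out.println("total=" + total);
    }
}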

When the JVM started to support bytecode-to-native compilation, people realized that it does not make sense to blindly compile each and every method into native code. Instead, the concepts/technologies nowadays known as “Just-in-time-compilation” and “HotSpot” were developed. In a nutshell, after startup the JVM first of all interprets the bytecode and then during program execution decides which methods to compile into native code. The idea is that only “hot” methods are worth the compiling/optimization effort required to produce efficient native code. On the contrary, “cold” methods will be handled in interpreted mode until they become “hot” – which might never happen for some methods.

By the way, when you dynamically reload classes or instrument methods at run time, the new bytecode will usually also be interpreted for a while even if the old version was already compiled. So, for every new piece of bytecode it sees, the JVM normally takes a while to decide whether it is “hot”.

This means that you will indeed find bytecode interpretation in every modern JVM execution. If you would like every method to be compiled into native code the first time it is called, you could specify -Xcomp on the command line, but I cannot really recommend this approach. JVMs are pretty clever nowadays!

 

At run time, -Xcomp makes the JVM compile every method to native code immediately on first use; -Xint never compiles bytecode to native code and always interprets it; -Xmixed (the default) compiles frequently invoked methods to native code and leaves rarely used methods interpreted.