Z370 / Z390 VRM Discussion Thread
post #4161 of 4248 - 08-11-2019, 06:34 AM
robertr1
In my OC and learning adventures, it seems there's a relationship between VID and OC potential. I've been messing around overclocking my 9900K (with HT off) and noticed something interesting over the past 2 days. Running the same tests and daily usage, my VID doesn't really go up much in relation to the vcore bump. I've gone from 5 GHz to 5.3 GHz and vcore has gone from 1.25 to 1.38 with LLC High. So really everything in the system has stayed pretty constant aside from the bump in frequency and vcore in BIOS. This is using static vcore.

Questions I'm trying to answer:
Why is adding vcore not making the VID go up in a similar fashion? In the screenshots below, I went from 1.34 V to 1.38 V vcore (in BIOS with LLC High) but max VID only went up by 0.010 V. This is running the same workloads.
Is LLC a static offset and not percent-based? The vdroop change is exactly proportional to the vcore bump I give it each time.

Screenshots below are HWiNFO @ 5.2 and 5.3 after putting different loads on the system (RealBench, Cinebench, heavy desktop usage, CPU-Z, other benches and stress tests). Cinebench R15 @ 5.3 shows that the system is scaling appropriately.

CPU-Z validation so you can see the full specs: https://valid.x86.fr/p1617w
[Attached screenshots: 2019-08-11 (3).png, 2019-08-10.png, 2019-08-11 (2).png]

post #4162 of 4248 - 08-11-2019, 09:41 AM
Falkentyne
Quote: Originally Posted by robertr1
In my OC and learning adventures, it seems there's a relationship between VID and OC potential. I've been messing around overclocking my 9900K (with HT off) and noticed something interesting over the past 2 days. Running the same tests and daily usage, my VID doesn't really go up much in relation to the vcore bump. I've gone from 5 GHz to 5.3 GHz and vcore has gone from 1.25 to 1.38 with LLC High. So really everything in the system has stayed pretty constant aside from the bump in frequency and vcore in BIOS. This is using static vcore.

Questions I'm trying to answer:
Why is adding vcore not making the VID go up in a similar fashion? In the screenshots below, I went from 1.34 V to 1.38 V vcore (in BIOS with LLC High) but max VID only went up by 0.010 V. This is running the same workloads.
Is LLC a static offset and not percent-based? The vdroop change is exactly proportional to the vcore bump I give it each time.

Screenshots below are HWiNFO @ 5.2 and 5.3 after putting different loads on the system (RealBench, Cinebench, heavy desktop usage, CPU-Z, other benches and stress tests). Cinebench R15 @ 5.3 shows that the system is scaling appropriately.

CPU-Z validation so you can see the full specs: https://valid.x86.fr/p1617w
Base VID is based on the CPU core ratio (assuming cache is 300 MHz lower than core, and not closer than that). Base VID stops scaling once the highest turbo boost (1/2-core) multiplier is reached, because the CPUs are not calibrated to exceed max turbo boost (so that's x50 for the 9900K and x49 for the 9700K). At lower multipliers, base VID drops sharply at most steps, all the way down to 800 MHz. Setting cache closer than 300 MHz to the core ratio, or even higher than it, skews this.

VID also decreases by 1.5 mV for every 1°C drop below 100°C, so a 150 mV range between 0°C and 100°C. This is called "Thermal Velocity Boost" (not the same thing as the laptop version, which affects turbo boost rather than VID), and it stops happening below a x40 multiplier.
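To put rough numbers on that scaling, here's a minimal Python sketch (the 1.5 mV/°C figure and the 100°C reference point are taken from the description above; treat the output as an approximation, not a measured value):

[CODE]
# Approximate VID shift from the ~1.5 mV/degC scaling described above.
# Applies below 100 C, and reportedly stops below a x40 multiplier.

def tvb_vid_delta_mv(temp_c, ref_temp_c=100.0, mv_per_degc=1.5):
    """Millivolts subtracted from the VID for every degree below ref_temp_c."""
    return (ref_temp_c - temp_c) * mv_per_degc

print(tvb_vid_delta_mv(45))  # a 45 C core requests ~82.5 mV less than at 100 C
print(tvb_vid_delta_mv(90))  # a 90 C core requests ~15 mV less
[/CODE]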

AC Loadline is a resistance value (mOhms) the CPU uses to boost its own supply-voltage request, and it is what sets the VRM's target voltage on Auto (or DVID) vcore. The boost is based on load (current), so the VID request is raised higher at load than at idle, and the heavier the load, the higher the target voltage request. This is limited to a maximum of 1.520 V (the max VID on the Intel spec sheet). If this sounds like loadline calibration in reverse, it basically is, but the difference is that it operates on the CPU's *requested* voltage, not on the vdroop of the voltage the VRM is actually supplying (which is VRM loadline, or "DC Loadline"). AC Loadline (as far as I know) has no transient response penalty (voltage spikes/drops) like VRM loadline does, because it is just a base voltage request. It's probably impossible to read in real time what the VRM is receiving from the CPU (you'd need oscilloscope speed), but it can't be anywhere near what happens on the VRM voltage line, with all the transients at higher loadline calibration levels. Think of it as the "Fixed" voltage changing dynamically based on current draw. Since vdroop will be at a nice, big, healthy level (if LLC is left on the lowest setting), you gain a huge stability benefit here.

The VRM ignores this on fixed override voltages, but VID is still affected. The value after AC Loadline is factored in is the target voltage the CPU requests from the VRM directly; a DVID offset (+/-) is then applied on top of it, if used.
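As a rough illustration of that request chain, here is a minimal Python sketch (the base VID, the current, and the order in which the cap and DVID offset apply are assumptions for illustration, not documented behavior):

[CODE]
# Sketch of the VID request chain described above (values illustrative):
# request = base VID + AC_loadline(mOhm) * current(A), capped at 1.520 V,
# with any DVID offset applied on top (the ordering here is an assumption).

def vid_request(base_vid_v, ac_ll_mohm, current_a, dvid_offset_v=0.0, cap_v=1.520):
    boosted = base_vid_v + (ac_ll_mohm * current_a) / 1000.0  # mOhm * A = mV
    return min(boosted, cap_v) + dvid_offset_v

# Example: 1.20 V base VID, AC Loadline 1.6 mOhm (the 8-core spec maximum), 150 A load
print(vid_request(1.20, 1.6, 150))  # ~1.44 V requested from the VRM
[/CODE]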

If you want to see the theoretical VID request the CPU is making (there are no drawbacks to this; it's fully safe), set DC Loadline manually to 1 (0.01 mOhms) in Internal VR Settings. You may be shocked at what you see.

While VRM Loadline is "DC" Loadline, the Internal VR "DC Loadline" DOES NOT CONTROL VDROOP FROM THE VRM!! It only controls VID droop on the CPU VID! The VRM IGNORES THIS VALUE! That's because DC Loadline is used for POWER MEASUREMENTS. The Intel spec documentation sheet states this. Basically, DC Loadline affects "VID" in the exact same way that VRM Loadline affects Vcore (VR VOUT). DC Loadline is used to calculate CPU Package Power because CPU Package Power is equal to VID * Amps.

VRM DC loadline is actually "Loadline Calibration". That controls vdroop from the VRM and shows up accurately as VR VOUT. Unfortunately there are only presets, which don't display the mOhm values, but they correspond to mOhm values; Standard and Normal are 1.6 mOhms. If the VRM loadline is set to the same value as DC Loadline (in Internal VR Settings), then on auto vcore VID and VR VOUT will be within 5 mV of each other. Using DVID offsets will change VR VOUT but not VID (remember, DC Loadline does not affect the VRM at all).
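A minimal sketch of that bookkeeping, assuming the relationships described above (the 1.44 V request and 150 A load are illustrative numbers, not measurements):

[CODE]
# Sketch: DC Loadline droops the *reported* VID (not the VRM output), and
# package power is then computed from that VID.

def reported_vid(request_v, dc_ll_mohm, current_a):
    return request_v - (dc_ll_mohm * current_a) / 1000.0  # mOhm * A = mV of droop

def package_power_w(vid_v, current_a):
    return vid_v * current_a  # per the spec-sheet relation: Package Power = VID * Amps

vid = reported_vid(1.44, 1.6, 150)       # -> 1.20 V reported VID
print(vid, package_power_w(vid, 150))    # -> ~180 W package power
[/CODE]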

As you can see, it is VERY unwise to combine a high AC Loadline (1.6 mOhms is the maximum specification for 8-core Coffee Lake) with a high VRM loadline (loadline calibration)--this will cause dangerous voltages on auto vcore. If you want to see why, do what I said above, which I will repeat again: set DC Loadline to 1 (0.01 mOhms) and watch the CPU VID.

On auto vcore, assuming you were sensible and kept maximum vdroop enabled (1.6 mOhms, Loadline Calibration = Standard/Normal), you may notice that CPU VID and VR VOUT drop at load compared to idle, and that as temps rise (rising temps = rising current) they drop even more. This assumes you set DC Loadline to match the VRM loadline (so DC Loadline = 160 (1.6 mOhms) if Vcore Loadline Calibration = Standard/Normal; LLC = Low is 1.3 mOhms, equal to DC Loadline = 130).

But what about VID rising 1.5 mV per 1°C temp increase? And VR VOUT should match VID on auto voltage, so if you set Vcore = Auto, LLC = Standard, AC Loadline = 160 and DC Loadline = 160 (1.6 mOhms), why are VID and VR VOUT dropping here?

That's because the VID request is capped at 1.520 V (again, check the experiment with your fixed vcore and DC Loadline set to 1), and it reaches that cap as soon as a load is put on the processor. This is BEFORE DC Loadline drops the VID afterwards, so the VID cannot rise past 1.520 V as temps go up; vdroop (1.6 mOhms * Amps = vdroop in millivolts) just keeps pulling the VID and VR VOUT down on auto voltage.
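Putting the cap and the droop together shows why the readings fall at load; a sketch assuming a matched AC/DC loadline of 1.6 mOhms and purely illustrative currents:

[CODE]
# Why the readings fall at load on auto vcore: the boosted request hits the 1.520 V cap,
# so rising temperature (= rising current) can no longer raise it, while the droop
# (loadline * current) keeps growing. Loadline values and currents are illustrative.

VID_CAP_V = 1.520

def loaded_vid(base_vid_v, ac_ll_mohm, dc_ll_mohm, current_a):
    request = base_vid_v + ac_ll_mohm * current_a / 1000.0   # AC Loadline boost
    request = min(request, VID_CAP_V)                        # cap (no SVID Offset)
    return request - dc_ll_mohm * current_a / 1000.0         # DC Loadline droop

for amps in (100, 150, 200):
    print(amps, round(loaded_vid(1.30, 1.6, 1.6, amps), 3))
# 100 A -> 1.30 (request still under the cap); 150 A -> 1.28; 200 A -> 1.20
[/CODE]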

If SVID OFFSET is enabled, VID is allowed to exceed 1.520 V by up to 200 mV (Intel's doc sheets specify this as "offset capability" via an MSR; do NOT confuse it with DVID!). With it enabled, you would see VR VOUT and VID *rise* at load as temps go up, since the 1.520 V cap is lifted, so VID can slowly increase with temperature, up to 1.72 V. SVID Offset is only useful when AC Loadline is set to a low value (like 90): it lets you reduce your idle voltage quite a bit while keeping the load voltage the same or similar to having SVID Offset disabled and AC Loadline = 160. Obviously it is DANGEROUS (once again) to use any sort of Vcore Loadline Calibration on auto voltages when AC Loadline is doing the voltage-boosting work for you.

post #4163 of 4248 - 08-11-2019, 03:21 PM
robertr1
Quote: Originally Posted by Falkentyne

The VRM ignores this on fixed override voltages, but VID is still affected. The value after AC Loadline is factored in is the target voltage the CPU requests from the VRM directly; a DVID offset (+/-) is then applied on top of it, if used.

If you want to see the theoretical VID request the CPU is making (there are no drawbacks to this; it's fully safe), set DC Loadline manually to 1 (0.01 mOhms) in Internal VR Settings. You may be shocked at what you see.

On auto vcore, assuming you were sensible and kept maximum vdroop enabled (1.6 mOhms, Loadline Calibration = Standard/Normal), you may notice that CPU VID and VR VOUT drop at load compared to idle, and that as temps rise (rising temps = rising current) they drop even more. This assumes you set DC Loadline to match the VRM loadline (so DC Loadline = 160 (1.6 mOhms) if Vcore Loadline Calibration = Standard/Normal; LLC = Low is 1.3 mOhms, equal to DC Loadline = 130).
That was really helpful! A lot of it is still over my head; I'm going to keep reading and learning. I did what you suggested and put some loads on it. It was crazy to watch the VID scale up with temp as you stated. This is back at 5.2 GHz for cooling reasons, btw. At 90°C core load, the VID went up to 1.469 while VR VOUT was around the same as before.

Is this temp scaling the reason why chips get more headroom when you keep them cooler? For example, if my chip were running at 45°C, the VID would drop per the scaling factor you mentioned. Does that mean I'd theoretically have more headroom to clock higher? The relationship is the piece I can't get my head around.
[Attached screenshot: 2019-08-11 (5).png]

post #4164 of 4248 - 08-11-2019, 03:53 PM
Falkentyne
Quote: Originally Posted by robertr1
That was really helpful! A lot of it is still over my head; I'm going to keep reading and learning. I did what you suggested and put some loads on it. It was crazy to watch the VID scale up with temp as you stated. This is back at 5.2 GHz for cooling reasons, btw. At 90°C core load, the VID went up to 1.469 while VR VOUT was around the same as before.

Is this temp scaling the reason why chips get more headroom when you keep them cooler? For example, if my chip were running at 45°C, the VID would drop per the scaling factor you mentioned. Does that mean I'd theoretically have more headroom to clock higher? The relationship is the piece I can't get my head around.
Thermal Velocity Boost VID scaling has nothing to do with this directly, but obviously the higher the temp, the more voltage may be needed (to increase electrical signal strength) for stability. TVB is not any sort of 1:1 ratio for when a chip will be stable; chips simply can operate at a higher frequency the cooler they are, because heat degrades the electrical signal. You're best off googling that semiconductor question, as it's been explained in far more detail than anything I could ever do (remember, I'm a gamer).

post #4165 of 4248 - 08-16-2019, 01:40 PM
AlphaC
Quote:
Between the overbuilt VRM and hefty heatsink, the ASUS ROG MAXIMUS XI HERO (WiFi) thwarted my best attempts to make it thermal throttle. It was a good ten degrees hotter than the ASRock Z390 Phantom Gaming X, the other Z390 board I have tested with this method. As both boards are far below their thermal limits, this is largely an arbitrary difference (the Phantom Gaming X is also US$40 more expensive).
https://www.techpowerup.com/review/a...o-wifi/15.html
5.1 GHz CPU, 3866 MHz memory
Load power: 233 W
VRM temperature (from graph): ~69°C


Let's not forget he also tested the Phantom Gaming 7 (5.1 GHz CPU, 3866 MHz memory), which is going for ~$185:
https://www.techpowerup.com/review/a...ming-7/15.html
Load power: 254 W
VRM temperature: 57.9°C

Phantom Gaming 9 (5.1 GHz CPU, 3866 MHz memory):
https://www.techpowerup.com/review/a...ming-9/15.html
Load power: 260 W
VRM temperature: 50.0°C

Taichi (5.1 GHz CPU, 3733 MHz memory):
Load power: 251 W
VRM temperature: 52.0°C

post #4166 of 4248 - 09-04-2019, 08:09 AM
CiTay
I have an MSI MEG Z390 ACE, and I've read a couple of posts in this thread about its VRM with interest. Specifically, there was speculation about the CPU VRM switching frequency, which was thought to be a low 300 kHz to keep switching losses and temperature under control. But the lower the switching frequency, the worse the transient response and the higher the ripple, in theory.


Quote: Originally Posted by AlphaC
The ACE might perform on the Taichi level thermally, but that's only if the heatsink is as good. Also, that doesn't account for the switching frequency: the switching frequency on a power block or power stage is typically 500 kHz, which is nearly double that of a typical PowerPAK implementation (~300 kHz).


If I had to take a guess, I would think the Z390 ACE performs about as well as a Z390 Taichi in terms of thermals but worse in terms of ripple (due to a 300 kHz default switching frequency vs 500 kHz). With 160 A output you're looking at anywhere from 1.2 to 2 W of heat output per high-side FET depending on switching frequency, due to the high-side FET being a slow-switching one.



I can confirm that the VRM on the Z390 ACE isn't very efficient: I'm at about 40 W power consumption at idle, and that's with one HDD and three SSDs, an NVIDIA GTX 1660 Ti, a 9600K (non-OC) and 2x 8 GB DDR4 at 1.2 V. With a more mid-range board, one would expect to be close to 30 W with this configuration, or even slightly lower.

However, I don't think the default CPU VRM switching frequency is 300 kHz. In the BIOS, apart from Auto, there is a range of 350 kHz, 400 kHz, 500 kHz, and so on up to 1000 kHz. When I manually set it to 350 kHz, idle power consumption drops by ~3 W, and it's always around 3 W less under any other load as well (measured with an energy meter). The VRM also stays cooler than before. Even at 400 kHz, I'm still getting a good 2 W lower power draw than with the Auto setting.

This leads me to believe that MSI might use a switching frequency of 500 kHz by default, perhaps even a bit higher. I can't 100% confirm this yet, as I'm still stability testing at the 400 kHz manual setting.

Question: If I can save power and have lower temps, and everything is stable, is there a reason NOT to use a 350 kHz VRM switching frequency? I don't plan to overclock (except adding RAM that runs at DDR4-3200).
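For a sense of why a few hundred kHz matters: hard-switching loss in the high-side FET scales roughly linearly with frequency. A minimal Python sketch using the generic first-order estimate (the input voltage, per-phase current and transition times below are illustrative guesses, not the Z390 ACE's actual FET figures):

[CODE]
# First-order high-side switching-loss estimate:
# P_sw ~ 0.5 * Vin * I * (t_rise + t_fall) * f_sw
# Component values below are generic, not measured on the Z390 ACE.

def switching_loss_w(vin_v, i_a, t_rise_s, t_fall_s, fsw_hz):
    return 0.5 * vin_v * i_a * (t_rise_s + t_fall_s) * fsw_hz

# Per-phase example: 12 V input, ~13 A per phase, 10 ns rise + 10 ns fall
for fsw in (350e3, 500e3):
    print(f"{fsw/1e3:.0f} kHz -> "
          f"{switching_loss_w(12, 13, 10e-9, 10e-9, fsw):.2f} W per high-side FET")
# ~0.55 W at 350 kHz vs ~0.78 W at 500 kHz, before gate-drive and conduction losses
[/CODE]

Multiplied across all phases (plus gate-drive losses, which also scale with frequency), that is roughly the order of the ~3 W whole-board saving measured above, though the exact split depends on the actual FETs and drivers.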



In this video about the Gigabyte Z390 Aorus Master with the same IR35201, the guy says that it doesn't make much sense to go below 400 kHz switching frequency:
https://www.youtube.com/watch?v=6J7qnr0YNH8#t=11m54s (around 13:00 mins).
Then at 14:05 he says it might be best to just run it at 300 kHz switching frequency if that has the best VRM efficiency...




edit: Another thing, this guy here tested the Z390 ACE and used a thermal imaging camera to check MOS and cooler temperatures:
https://www.igorslab.media/msi-meg-z...de-igorslab/3/

He says there is an unusually high temperature delta between the MOS and the cooler, up to 40°C with heavy overclocking. He could see almost 105°C near the MOS under a high OC, but the cooler stays much... cooler. And the heatpipe stays suspiciously cool! He suspects they might have used a hollow tube that only looks like a heatpipe instead of a proper one, because based on other boards with heatpipes, the heatpipe should be much hotter, with a maximum delta of 10-20°C, not well over 40°C.

So in conclusion, this board seems overpriced. The VRM is capable, but not very efficient. The cooling is not the best either (and the coolers need a lot more fins in them, come on MSI!). Good thing I got the board for free.

post #4167 of 4248 - 09-13-2019, 11:13 AM
chibi
With the mindset that Core is King -> Memory -> Cache in terms of overclocking performance, I would like to inquire about the ASUS Maximus Gene XI & Apex XI.

Does the Gene XI lack in the VRM department when compared with the Apex XI?

CPU to be used will be 9900K with full custom watercooling.

post #4168 of 4248 - 09-14-2019, 06:43 AM
robertr1
Quote: Originally Posted by chibi
With the mindset that Core is King -> Memory -> Cache in terms of overclocking performance, I would like to inquire about the ASUS Maximus Gene XI & Apex XI.

Does the Gene XI lack in the VRM department when compared with the Apex XI?

CPU to be used will be 9900K with full custom watercooling.
Gene/Apex/Dark all trade blows at the top: https://hwbot.org/benchmark/cinebenc...=0#interval=20
post #4169 of 4248 - 09-15-2019, 02:25 PM
D-EJ915
Quote: Originally Posted by chibi
With the mindset that Core is King -> Memory -> Cache in terms of overclocking performance, I would like to inquire about the ASUS Maximus Gene XI & Apex XI.

Does the Gene XI lack in the VRM department when compared with the Apex XI?

CPU to be used will be 9900K with full custom watercooling.
From what I can tell, the Gene and Extreme boards both have the same VRM. Since they are different boards there are probably some differences in performance, like the Gene overclocking memory better, but yeah. The Apex XI has a bigger VRM but also no GPU output, if you wanted that. I think in most cases you wouldn't really notice much difference, to be honest, so pick whichever board has the features you like best.

https://www.overclock.net/forum/atta...2&d=1548963030

I think my old Apex IX looks better than the Apex XI: https://imgur.com/a/WWRUkCA

post #4170 of 4248 - 09-15-2019, 04:06 PM
ahnafakeef
I'm aiming for 5 GHz on all cores on a 9900K(/KF) cooled by a Cooler Master ML360, and I was looking at ASUS Maximus XI boards. But apparently they're not as good for overclocking? (The Aorus Master, for the price of the Hero, is as good as the Maximus XI Extreme?)

Will I benefit from a beefier VRM than the Hero's for my fairly modest overclocking goals?
[Attached image: z390 rev9.png]

