Composed by Galina Vitkova
Two years ago, on 11 March 2011, the strongest earthquake in Japan's recorded history, accompanied by a tsunami, struck the country. The earthquake and tsunami waves (the maximum wave height reached 40.5 metres) caused widespread devastation across a large part of Japan. More than 14,000 lives were lost, and at least 10,000 people remain missing. Many more inhabitants were displaced from their homes because their towns and villages were destroyed or swept away.
The tsunami caused a serious accident at the Nuclear Power Plant Fukushima-1 (also called NPP Fukushima Daiichi).
NPP Fukushima-1 is located near the town of Okuma in Fukushima prefecture. The plant was built in the 1960s and 1970s and is operated by the Tokyo Electric Power Company (TEPCO). It is equipped with six nuclear units with a total capacity of 4.7 GW.
Earthquake and accident
The accident at NPP Fukushima-1 occurred almost immediately after the earthquake and tsunami. The reactors in operation were shut down, but soon afterwards the external electricity supply was lost as well. The tsunami wave flooded the backup diesel generators, and as a result the cooling systems of Units 1, 2 and 3 failed. The active zones (cores) of these reactors melted.
The reaction between the zirconium cladding and water vapour produced hydrogen, which led to a series of explosions and the destruction of the buildings housing the reactors.
Units 5 and 6 were not destroyed because their diesel generator remained intact. With its help, the two reactors and two spent nuclear fuel pools could be kept cool.
As a consequence of the accident, radioactive substances, among them iodine-131 (with a very short half-life) and caesium-137 (with a half-life of 30 years), were emitted into the atmosphere and the sea. Small amounts of plutonium were also detected on the site. The radioactive contamination of the marine environment occurred through aerial deposition and through continuing discharges and outflows of water with various levels of radioactivity from the four damaged reactors.
The total quantity of radioactive releases amounted to about 20% of the emissions after the Chernobyl disaster. To reduce external exposure of the population, inhabitants were evacuated from the area within a radius of up to 30 km of Fukushima-1. The contaminated land area that should be decontaminated makes up about 3% of Japan's territory.
Radioactive substances were found in drinking water and food not only in Fukushima prefecture but in other regions of Japan, too. Many countries, including Russia, banned the import of Japanese products. Following the accident at the Fukushima NPP on 11 March 2011, the European Union approved the Implementing Regulation of 26 October 2012, which imposed special conditions governing the import of feed and food originating in or consigned from Japan. The controls performed at import show that these special conditions are correctly implemented by the Japanese authorities. The next review of the Regulation is expected by 31 March 2014.
Not since the Chernobyl accident in April 1986 had the reputation of nuclear power been damaged so seriously. The world began to question whether nuclear power could ever be safe. Many countries froze projects in this industry. Germany declared that by 2022 it would shut down its last NPP and would develop renewable sources of electricity production instead.
Removal of the accident impact
According to the intentions of the government of Japan, the full removal of the consequences of the accident at NPP Fukushima-1 will take 30 to 40 years. In December 2011 cold shutdown of the reactors was achieved. After that, work commenced on extracting spent nuclear fuel from the spent fuel pools. Then the nuclear fuel is to be extracted from the reactors themselves. Finally, the complete demolition and decommissioning of the plant's technological equipment should be performed.
On the evening of 18 March 2013 a new incident occurred, caused by the failure of the cooling systems of the spent nuclear fuel pools of Units 1, 3 and 4. It happened after a power outage at NPP Fukushima-1. On 19 March TEPCO managed to restart the cooling system of Unit 1. Nevertheless, troubles in the cooling systems of Units 3 and 4 and in the common pool continue to this day.
Nuclear power accounts for about 14% of electricity production in the world.
In Japan the production of electricity in 2007, before the Fukushima accident, amounted to 264 TWh from an installed nuclear capacity of 49 GW (i.e. 23.5% of Japan's total installed capacity).
In Germany, electricity produced from an installed nuclear capacity of 20 GW (23.5% of total German installed capacity) amounted to 141 TWh in 2007. You can see the changes in German energy policy after the Fukushima accident in Energy policy of Germany after Fukushima and in Германия после Фукусимы.
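As a rough cross-check on these figures, you can compute the implied capacity factor, i.e. average output divided by installed capacity (this little calculation is my own illustration, not part of the cited statistics):

```python
# Capacity factor = annual energy produced / (installed capacity * hours in a year)
HOURS_PER_YEAR = 8760

def capacity_factor(annual_twh, installed_gw):
    # 1 TWh = 1000 GWh, so the result is dimensionless
    return annual_twh * 1000 / (installed_gw * HOURS_PER_YEAR)

cf_japan = capacity_factor(264, 49)    # Japanese NPPs in 2007
cf_germany = capacity_factor(141, 20)  # German NPPs in 2007
print(round(cf_japan, 2), round(cf_germany, 2))  # 0.62 0.8
```

Both values are in the range typical of nuclear plants run as base-load capacity.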
You can find more details about world producers of electricity in Statistics on nuclear power.
If you are tired of studying figures, take a rest and fix your eyes on the picture below. Enjoy!
NPP – Nuclear Power Plant
TEPCO – Tokyo Electric Power Company
TWh – terawatt hour
GW – gigawatt
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:299:0031:0041:EN:PDF – Official Journal of the European Union
http://www.aif.ru/society/ – Argumenty i Fakty (Аргументы и Факты), March 2013
Dear friend of Technical English!
This time we will discuss hydropower. Hydropower accounts for about 16% of global electricity consumption. In 2009/2010, some 11,000 hydro power plants (HPPs) in 150 countries were generating electricity. The total electricity generated by HPPs in 2009 reached 3,329 TWh, or 16.5% of global electricity production.
So, pay attention and make comments on the technical text below. Enjoy!
Sayano-Shushenskaya Hydro Power Plant
The Sayano-Shushenskaya hydro power plant is the largest power plant in Russia by installed capacity and currently the sixth-largest operating hydro power plant in the world. The HPP is part of the Yenisei Cascade, located in the territory of Krasnoyarsk Krai and the Republic of Khakassia. The cascade comprises three stages.
All the hydro power plants were designed by the Lengidroproject institute. The main consumers of the electricity produced by the HPPs are the aluminium smelters of the company Rusal. A considerable part of the produced electricity is delivered to the energy system of Siberia.
Ten turbine-generator units are installed at the Sayano-Shushenskaya HPP, each with a capacity of 640 MW at a rated head of 194 metres (636 ft). The efficiency of a turbine in the optimal zone is about 96%; the total weight of one turbine is 1,440 t.
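These quantities are linked by the standard hydropower relation P = η·ρ·g·Q·H. As a sketch (this relation and the resulting flow estimate are my own illustration; they are not stated in the source), the flow a single unit needs at the rated head can be estimated:

```python
# Estimate the flow rate a single 640 MW unit needs at the rated head,
# from P = efficiency * water density * g * flow * head
P = 640e6     # rated power of one unit, W
eta = 0.96    # turbine efficiency in the optimal zone
rho = 1000.0  # water density, kg/m^3
g = 9.81      # gravitational acceleration, m/s^2
H = 194.0     # rated head, m

Q = P / (eta * rho * g * H)  # required flow rate, m^3/s
print(round(Q))  # about 350 m^3/s per turbine
```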
The turbines drive ten hydrogenerators, each equipped with a rotor 10.3 m in diameter and generating electricity at 15.75 kV. The turbines were manufactured at the Leningradsky Metallichesky Zavod, whereas the Electrosila works manufactured and supplied the generators.
One generating power unit consists of two neighbouring turbine-generator units and feeds one group of three single-phase transformers, each with a capacity of 533 MVA at a voltage of 15.75/500 kV (in total, 15 main transformers are installed at the HPP).
The 500 kV open switchyard is located 1.2 km downstream on the Yenisei river. Electricity is delivered from the HPP to the switchyard through three transmission lines along the left bank of the river and two transmission lines using long-span supports placed on rock excavations on the right bank. From the switchyard, electric power is fed into the energy system through four 500 kV transmission lines.
The station structures include a concrete arch-gravity dam, an administrative power plant building located near the dam, and an additional spillway. The dam, 245.5 metres (805 ft) high, creates a maximum head of 220 metres (720 ft). The water pressure on the dam is approximately 30 million tonnes, 60% of which is neutralized by the dam's own weight, while 40% is transferred to the rock of the banks.
The dam is designed to safely withstand earthquakes of up to 8 on the Richter scale. It was recorded in the Guinness Book of World Records as the strongest construction of its type.
The dam supports the Sayano-Shushenskoye reservoir, with a total capacity of 31.34 km³, a useful capacity of 15.34 km³ and a surface area of 621 km² (240 sq mi).
PS: You can find unknown words and expressions from the area of power engineering (including hydro power) in TrainTE Vocabulary (Power engineering: English–Russian–Czech vocabulary) and in Technical English Vocabulary – power engineering (Russian–English–Czech vocabulary).
enwikipedia.org – http://en.wikipedia.org/wiki/%D0%A1%D0%B0%D1%8F%CC%81%D0%BD%D0%BE-%D0%A8%D1%83%CC%81%D1%88%D0%B5%D0%BD%D1%81%D0%BA%D0%B0%D1%8F_%D0%B3%D0%B8%D0%B4%D1%80%D0%BE%D1%8D%D0%BB%D0%B5%D0%BA%D1%82%D1%80%D0%BE%D1%81%D1%82%D0%B0%CC%81%D0%BD%D1%86%D0%B8%D1%8F
The ITER Project is a large-scale international scientific project intended to prove the feasibility of nuclear fusion as an energy source. ITER was originally an acronym for International Thermonuclear Experimental Reactor, but at present it is not considered an official abbreviation and is instead connected with the Latin word iter, which means "way", "journey" or "direction".
The project is expected to collect the data necessary for the design and operation of the first electricity-producing fusion power plant. As is known, all nuclear power plants (NPPs) currently operating throughout the world produce electricity from fission, which is accompanied by high-level, long-lived radioactive waste; this provokes strong public protests against these NPPs.
Recall that the ITER Agreement was officially signed at the Elysée Palace in Paris on 21 November 2006 by ministers from the seven ITER Members (China, the European Union, India, Japan, Korea, Russia and the United States) in the presence of French President Jacques Chirac and the President of the European Commission, José Manuel Barroso. This Agreement established a legal international entity responsible for the construction, operation and decommissioning of ITER. The seven ITER Members have shared in the design of the installation, the creation of the international project structure, and its funding.
On 24 October 2007, after ratification by all Members, the ITER Agreement entered into force and officially constituted the ITER Organization. ITER was originally expected to cost approximately €5 billion. However, the rising price of raw materials and changes to the initial design have more than tripled that amount, to about €16 billion. It should be added that ITER Members make 90% of their contributions in kind, i.e. in the form of equipment: the Members manufacture the appropriate devices and contribute them to the project. The remaining 10% of the contributions is paid in cash. Russia has undertaken to manufacture 18 high-technology systems for the ITER project.
The programme is anticipated to last for 30 years: 10 for construction and 20 years of operation. The reactor is expected to take 10 years to build, with completion in 2018 (according to some sources, in 2020). The ITER site in Cadarache, France, stands ready: in 2010 construction began on the ITER tokamak and scientific buildings.
At the end of October 2012, the first tests of unique equipment for the ITER project were launched at the D. V. Efremov Research Institute of Electrophysical Apparatus in Saint Petersburg.
The plasma-facing components of the divertor target prototype of the ITER reactor are being tested (details about the divertor target can be found in Вольфрамовая облицовка диверторной мишени для …). A proprietary test facility, the IDTF (ITER Divertor Test Facility), has been built for the testing. The facility makes it possible to expose the ITER components to the same thermal loads as during operation and maintenance of the experimental reactor. The plasma temperature is expected to reach 100-150 million degrees, and the heat load on the divertor surface will rise up to 20 MW/m². That is why the components under test must comply with very strict requirements.
The components to be tested at the Russian facility were produced in Japan. The testing is held in the presence of representatives of the ITER agencies of Russia and Japan, as well as with the participation of specialists from the ITER International Organization.
Conclusions about the test results are expected by the end of November 2012. This will be the first of numerous series of tests and trials whose results will make it possible to master a well-proven technology for manufacturing the ITER components.
PS: The technical terms on the topic can be found in TrainTE Vocabulary (Power engineering: English–Russian–Czech vocabulary) and in Vocabulary – power engineering (Russian–English–Czech). PPS: The Russian version of the article, titled Проект ИТЭР в реализации, is published at the blog Technical English Remarks.
Dear friend of Technical English,
There are several posts on this blog devoted to power engineering. To gain more details concerning the topic, visit About the blog, where you will find the full list of such posts. Given the interest in the subject, another new technical text is offered below for studying and discussing. Improve your Technical English, enrich your vocabulary in this area, and write comments expressing your opinion about the future of electricity supply. Indicate your opinion on renewables vs. nuclear power plants. Uncover new technical terms in TrainTE Vocabulary.
Composed by Galina Vitkova
Before Fukushima, Germany had ambitious energy targets. Its Energy Concept 2010 approved an extension of the operating times of the 17 German nuclear power plants (NPPs) as a bridging technology towards a renewable energy supply.
The energy and climate package of 26 November 2010 (Energy Concept 2010) comprised four key elements.
Legislative changes following the Fukushima nuclear accident in Japan in 2011 (the energy policy shift of 2011) stopped the nuclear extension. An amendment of the Atomic Energy Act (AtG) stipulated the immediate shutdown of eight power plants and set down a phase-out of the remaining nine nuclear power plants by 2022.
Under these circumstances the German Energy Agency (dena) presented a new study examining the consequences of the German energy policy shift and the challenges lying ahead. The Agency predicts that electricity prices will rise considerably by 2050 and that conventional power plants will still be needed to a large extent to ensure security of supply and to balance the increasing amount of intermittent renewable energy input.
The intermittence of photovoltaics (PV), for instance, is illustrated at http://www.sma.de/en/company/pv-electricity-produced-in-germany.html, Performance of Photovoltaics (PV) in Germany. On that site you can see at any time the total output of all PV plants installed in Germany up to the specified cut-off date. At present the total installed capacity of PV plants in Germany amounts to 29 GW. Examples of their generation on high-yield (2012) and low-yield (2011) days are given in the table below (see also Intermittence of renewables).
|Low yield days (2011)|Daily output|Full-load hours|Profit yield days (2012)|Daily output|Full-load hours|
|01.01.2011|3 GWh|0.1 hours daily|25.05.2012|179 GWh|6 hours daily|
|17.03.2011|8 GWh|0.28 hours daily|24.05.2012|165 GWh|6.7 hours daily|
|30.07.2011|37 GWh|1.28 hours daily|27.06.2012|119 GWh|4 hours daily|
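The "hours daily" figures are equivalent full-load hours: the day's energy output divided by the installed capacity. A minimal sketch (assuming the 29 GW installed capacity quoted above; the table's own values differ slightly because installed capacity grew over the period):

```python
# Equivalent full-load hours = daily energy output / installed capacity
INSTALLED_GW = 29  # total installed PV capacity in Germany, as quoted above

def full_load_hours(daily_gwh):
    return daily_gwh / INSTALLED_GW

print(round(full_load_hours(179), 1))  # 25.05.2012 -> about 6.2 hours
print(round(full_load_hours(3), 1))    # 01.01.2011 -> about 0.1 hours
```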
The study was carried out in relation to Germany's target of increasing the share of renewable energy sources in the electricity supply to at least 80% by 2050. In preparing the study, dena cooperated with RWTH Aachen University. The study was accepted by RWE AG.
According to the study, the installed power capacity in Germany will amount to 240 GW in total in 2050, with 170 GW of renewable power plants and 61 GW provided by conventional fossil-fuelled power plants. This means that conventional capacity will decrease by only 37% compared with 2010. By 2050, efficient gas- and coal-fired power plants will provide roughly 60% of the secure electricity supply, whereas renewable power plants will deliver 24%.
To ensure the security of electricity supply, 49 GW of new conventional power plant capacity is needed, preferably by 2020 and at the latest by 2030.
According to dena, unless additional power plants are built, Germany will import approximately 134 TWh, or 22% of the electricity consumed, by 2050.
In view of the above need for new conventional power plants, possible imports, and the expansion of the grid infrastructure (including the connection of offshore wind farms, spinning reserve and new storage capacity, and the enlargement of the existing distribution and transmission grids), electricity prices will rise greatly by 2050.
From 2020 onwards there will increasingly be situations in which renewable power production exceeds demand. The excess of renewable electricity for which there is no demand in Germany or abroad may reach 66 TWh, or 15% of the electricity generated in Germany, by 2050. So, without a new market design, renewables would not be competitive by 2050. The Agency therefore demands a complete overhaul of the EEG, which promotes the input of renewable energy into the German grids by granting fixed feed-in tariffs that are higher than the electricity prices at the exchanges. For these reasons it proposes a European capacity market to encourage and stimulate investments in power plants that provide secure capacity.
Abbreviations
AtG – Atomic Energy Act (Atomgesetz)
billion – milliard
dena – German Energy Agency
EEG – Germany's Renewable Energy Sources Act
EKFG – Energy and Climate Fund (in Germany)
NPP – nuclear power plant
PV – photovoltaics
Composed by Galina Vitkova
Dear friends of Technical English,
I have three blogs, all of them related to Technical English, and none of them is popular with search engines. So I decided to study the topic carefully to make search engines notice my blogs and send more visitors to them. At the same time I wanted to prepare an interesting technical text suitable for studying Technical English and discussing the issue.
First, I tried to sum up the current technical terms to make them quite clear to me and to potential visitors of my blogs. Unfortunately I came across some terms for which I could not find an explanation.
Further, I looked again at how search engines work, to get better at bringing my blogs to the top of search results. Finally, I chose from the studied materials those recommended steps that I am able to accomplish in order to put keywords in the right places. I wonder if my new knowledge will attract more visitors to this post.
A keyword is a word used to make a search. Elsewhere, a query or a tag is used instead of a keyword. Of the billions of searches made, you need to decide which ones you want your site to top the Search Engine Results Pages (SERPs) for.
A keyphrase is a collection of words used to make a search (an equivalent to a keyword).
A target keyword is such a keyword which will bring your site to the top of a SERP. There are some online tools that can help you find and choose your target keywords.
A head keyword carries the highest volume of search engine visits. It is also called a primary keyword.
The long tail of keywords can be created from different combinations of head keywords, and the number of such combinations is almost endless. The longer your tail of keywords, the more visits search engines send to your site. Furthermore, the long tail offers more potential for profit than the head keywords.
A keyword niche is a group of keywords containing a single ‘seed’ keyword. So we target groups of keywords, i.e. a keyword niche.
A primary keyword is the keyword that has the highest volume of search engine visits (it is searched for most frequently); it is the head keyword. It is the most popular keyword and has the most potential to attract traffic.
A secondary keyword has lower volumes (it is searched for less often).
Usually you choose one primary keyword, but you might also pick two or more secondary keywords.
Understanding how search engines work helps in getting your website to the top of the SERPs. See e.g. the video at http://www.google.com/competition/howgooglesearchworks.html about how Google performs.
Every day Google answers more than one billion questions from people around the globe in 181 countries and 146 languages. 15% of each day's searches have never been seen before.
Let's take a look at the main steps of Google's search activities.
Crawling. Google visits billions of website pages and finds more and more pages by following (crawling) the links it uncovers on previous billions of pages.
Indexing. When visiting website pages Google stores the information about every found page in the index. Google’s index is like a huge filing system for all the pages it finds.
Matching. When a searcher starts searching for any item, Google searches its index for all the pages containing that item. Typically, Google will find thousands, even millions, of matches for a search.
Ranking. Google uses over 200 factors to decide in what order to display the matching pages. Each matching page is scored for each of the 200-plus factors and the scores are totalled. The total score is then used to rank the matching pages and decide the order in which the results are presented on the SERPs (highest at the top).
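The indexing and matching steps can be illustrated with a toy inverted index (a deliberately simplified sketch of the general technique; the pages and texts are invented, and this is of course nothing like Google's real implementation):

```python
# Crawled pages (here just hard-coded sample texts)
pages = {
    "page1": "nuclear power plant safety",
    "page2": "hydro power plant turbine",
    "page3": "wind power intermittence",
}

# Indexing: build an inverted index mapping each word to the pages containing it
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Matching: return the pages that contain every word of the query
def match(query):
    results = None
    for word in query.split():
        found = index.get(word, set())
        results = found if results is None else results & found
    return results or set()

print(sorted(match("power plant")))  # ['page1', 'page2']
```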
Ranking factors include, for each page, such things as its keywords and the links pointing to it. A detailed explanation of the issue is given at http://www.wordtracker.com/academy/seo-made-simple.
PS: Find additional information about search engines in Search engine – essential information, December 29, 2011, and in Search Engines, the presentation at the SKYPE conference on 27 August 2008.
And now, when you have chosen and set all the keywords recommended by professionals, you should put them in the appropriate places. There are many recommendations related to the issue. From what I have studied while trying to attract search engines to my blogs, I have learned where the keywords mainly need to go.
PageRank is a link analysis algorithm used by the Google Internet search engine. The algorithm assigns a numerical weighting to each element of hyperlinked documents on the World Wide Web with the purpose of "measuring" its relative importance within the Web. According to the Google theory, if Page A links to Page B, then Page A is saying that Page B is an important page. If a page has more important links to it, then its links to other pages also become more important.
PageRank was developed at Stanford University by Larry Page (hence the term PageRank) and Sergey Brin as part of a research project about a new kind of search engine. "PageRank" is now a trademark of Google. The PageRank process has been patented, and the patent was assigned to Stanford University, not to Google; Google has exclusive licence rights to the patent from the university. The university received 1.8 million shares of Google in exchange for use of the patent; the shares were sold in 2005 for $336 million.
The first paper about the project, describing PageRank and the initial prototype of the Google search engine, was published in 1998; shortly afterwards, Page and Brin founded Google Inc. Even though PageRank is now only one of about 200 factors that determine the ranking of Google search results, it continues to provide the basis for all of Google's web search tools.
As early as 1996, a small search engine called "RankDex", designed by Robin Li, was already exploring a similar strategy for site scoring and page ranking. The technology was patented by 1999 and was used later by Li when he founded Baidu in China.
Some basic information is needed to understand PageRank.
First, PageRank is a number that only evaluates the voting ability of all incoming (inbound) links to a page.
Second, every unique page of a site that is indexed in Google has its own PageRank.
Third, internal site links interact in passing PageRank to other pages of the site.
Fourth, PageRank stands on its own. It is not tied to the anchor text of links.
Fifth, there are two values of the PageRank that should be distinguished:
a. PageRank which you can get from the Internet Explorer toolbar (http://toolbar.google.com);
b. Actual or real PageRank that is used by Google for calculation of ranking web pages.
The PageRank from the toolbar (sometimes called the nominal PageRank) has a value from zero to ten. It is not very accurate information about a site's pages, but it is the only thing that gives you any idea about the value. It is updated approximately once every three months, more or less, while the real PageRank is calculated permanently as the Google bots crawl the web, finding new web pages and new backlinks.
Thus, in the following text the term actual PageRank is employed to deal with the actual PageRank value stored by Google, and the term Toolbar PageRank concerns the evaluation of the value that you see on the Google Toolbar.
The Toolbar value is just a representation of the actual PageRank. While the actual PageRank is linear, Google uses a non-linear scale for its representation. So on the toolbar, moving from a PageRank of 2 to a PageRank of 3 takes less of an increase than moving from a PageRank of 3 to a PageRank of 4.
This is illustrated by a comparison table (from PageRank Explained by Chris Ridings). The actual figures are kept secret, so for demonstration purposes some guessed figures are used:
|If the actual PageRank is between|The Toolbar shows|
|0.00000001 and 5|1|
|6 and 25|2|
|25 and 125|3|
|126 and 625|4|
|626 and 3,125|5|
|3,126 and 15,625|6|
|15,626 and 78,125|7|
|78,126 and 390,625|8|
|390,626 and 1,953,125|9|
|1,953,126 and infinity|10|
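The guessed boundaries in the table grow roughly by a factor of five per toolbar point, i.e. a base-5 logarithmic scale. Under that assumption (both the base and the boundary figures are guesses, since the real ones are secret), the mapping could be sketched as:

```python
import math

def toolbar_pagerank(actual_pr):
    """Map a hypothetical actual PageRank onto the 1-10 toolbar scale,
    assuming the base-5 logarithmic bands from the table above."""
    if actual_pr <= 5:
        return 1
    return min(10, math.ceil(math.log(actual_pr, 5)))

print(toolbar_pagerank(100))      # inside the 25-125 band -> 3
print(toolbar_pagerank(2000000))  # above 1,953,126 -> 10
```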
Lawrence Page and Sergey Brin have published two different versions of their PageRank algorithm in different papers.
The first version (the so-called Random Surfer Model) was published in the Stanford research paper titled The Anatomy of a Large-Scale Hypertextual Web Search Engine in 1998:
PR(A) = (1-d) + d(PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

where PR(A) is the PageRank of page A, d is a damping factor set between 0 and 1 (nominally 0.85), PR(T1) is the PageRank of a page T1 pointing to page A, and C(T1) is the number of outgoing links on page T1.
In the second version of the algorithm, the PageRank of page A is given as:
PR(A) = (1-d)/N + d(PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

where N is the total number of all pages on the Web.
The first model is based on a very simple intuitive concept. PageRank is presented as a model of user behaviour, where a surfer clicks on links at random. The probability that the surfer visits a page is that page's PageRank. The probability that the surfer clicks on a particular link on a page is determined by the number of links on the page. The damping factor d is the probability, at each page, that the surfer will get bored and jump to another random page.
The second notation considers the PageRank of a page to be the actual probability of a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over all web pages, so the sum of the PageRanks of all pages is one.
As for calculating PageRank, the first model is easier to compute because the total number of web pages is disregarded.
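The first version of the formula lends itself to a simple iterative computation. A minimal sketch for a hypothetical three-page site (the pages and links are invented for illustration):

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively evaluate PR(A) = (1-d) + d*(PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)).
    links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    pr = {page: 1.0 for page in pages}  # initial guess
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # sum the votes of every page that links to `page`
            incoming = sum(pr[src] / len(targets)
                           for src, targets in links.items()
                           if page in targets)
            new_pr[page] = (1 - d) + d * incoming
        pr = new_pr
    return pr

# A links to B and C, B links to C, C links back to A
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Note that with this first version the PageRanks average to 1 rather than summing to 1; the normalization to a probability distribution is exactly what the second version adds.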
Dear friend of technical English,
Do you want to improve your professional English?
Do you want at the same time to gain comprehensive information about the Internet and Web?