Why Technical English

Project ITER in progress

December 16, 2012
Leave a Comment
Composed by Galina Vitkova 

The Project ITER is a large-scale international scientific project intended to prove the practicability of nuclear fusion as an energy source. ITER was originally an acronym for the International Thermonuclear Experimental Reactor, but today the name is no longer treated as an official abbreviation; instead it is associated with the Latin word "iter", which means "way", "journey" or "direction".

Deuterium-tritium fusion diagram

The project is expected to collect the data necessary for the design and operation of the first electricity-producing fusion power plant. As is well known, all nuclear power plants (NPPs) currently operating throughout the world produce electricity from fission, which is accompanied by high-level, long-lived radioactive waste and provokes strong public protests against these NPPs.

The project is based on the tokamak (toroidal chamber with magnetic coils), a Soviet-Russian technology: a device that uses a magnetic field to confine plasma in the shape of a torus.

ITER is the culmination of decades of fusion research: more than 200 tokamaks (see also Nuclear power – tokamaks) built around the world have paved the way to the ITER experiment.

Some History

Just to recall: the ITER Agreement was officially signed at the Elysée Palace in Paris on 21 November 2006 by Ministers from the seven ITER Members (China, the European Union, India, Japan, Korea, Russia and the United States) in the presence of French President Jacques Chirac and the President of the European Commission José Manuel Barroso. This Agreement established a legal international entity to be responsible for construction, operation and decommissioning of ITER. The seven ITER Members have shared in the design of the installation, the creation of the international project structure, and its funding.

Fusion Power Grid

On 24 October 2007, after ratification by all Members, the ITER Agreement entered into force and officially constituted the ITER Organization. ITER was originally expected to cost approximately €5 billion. However, the rising price of raw materials and changes to the initial design have more than tripled that amount, to about €16 billion. It should be added that the ITER members provide 90% of their contributions in kind, i.e. as equipment: each member manufactures the appropriate devices and supplies them to the project. The remaining 10% of the contributions are paid in cash. Russia has undertaken to manufacture 18 high-technology systems for the ITER project.

The programme is anticipated to last for 30 years – 10 years for construction and 20 years of operation. The reactor is expected to take 10 years to build, with completion in 2018 (according to some sources, in 2020). The ITER site in Cadarache, France, stands ready: in 2010, construction began on the ITER Tokamak and the scientific buildings.

Testing

At the end of October 2012, the first tests of unique equipment for the ITER project were launched at the D. V. Efremov Research Institute of Electrophysical Apparatus in Saint Petersburg.

Tokamak - Creating the Sun on Earth

 

The plasma-facing components of the divertor target prototype of the ITER reactor are being tested (details about the divertor target can be found in the Russian-language article "Tungsten cladding of the divertor target for …"). A dedicated test facility, the IDTF (ITER Divertor Test Facility), has been built for this purpose. The facility makes it possible to expose the ITER components to the same thermal loads as during operation and maintenance of the experimental reactor. The plasma temperature is expected to reach 100–150 million degrees, and the heat load on the divertor surface is expected to rise to 20 MW/m². That is why the components under test must comply with very strict requirements.

The components to be tested at the Russian facility have been produced in Japan. The testing is carried out in the presence of representatives of the Russian and Japanese ITER Agencies, as well as with the participation of specialists from the ITER International Organization.

Conclusions about the test results are expected by the end of November 2012. This will be the first of numerous series of tests and trials whose results will make it possible to master a well-proven technology for manufacturing the ITER components.


 

PS: The technical terms on the topic can be found in
TrainTE Vocabulary (Power engineering: English–Russian–Czech vocabulary) and in
Vocabulary – power engineering (Russian–English–Czech).
PPS: The Russian version of this article, titled Проект ИТЭР в реализации (Project ITER in progress), is published on the blog Technical English Remarks.



Website – basic information

November 28, 2011
7 Comments

Website and Its Characteristics

Composed by Galina Vitkova using Wikipedia

A website (or web site) is a collection of web pages, typically served under a particular domain name on the Internet. A web page is a document usually written in HTML (HyperText Markup Language), which is almost always accessible via HTTP (HyperText Transfer Protocol). HTTP is a protocol that transfers information from the website server so that it can be displayed in the user's web browser. All publicly accessible websites together constitute the immense World Wide Web of information. More formally, a website might be considered a collection of pages dedicated to a similar or identical subject or purpose and hosted under a single domain.
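As a small illustration of that transfer (a minimal sketch using only Python's standard library; the address example.com is just a placeholder), the code below requests a page over HTTP, exactly as a browser does before rendering it, and prints the status and the beginning of the returned HTML.

```python
# Minimal sketch: an HTTP request for a web page, as a browser would issue it.
# "http://example.com/" is only a placeholder address.
import urllib.request

with urllib.request.urlopen("http://example.com/") as response:
    print(response.status)                    # e.g. 200 (OK)
    print(response.headers["Content-Type"])   # usually text/html for a web page
    html = response.read().decode("utf-8", errors="replace")
    print(html[:200])                         # beginning of the HTML document
```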

The pages of a website are accessed from a common root URL (Uniform Resource Locator, or Universal Resource Locator) called the homepage, and usually reside on the same physical server. The URLs of the pages organise them into a hierarchy. Nonetheless, it is the hyperlinks between web pages that shape how the reader perceives the overall structure and how the traffic flows between the different parts of the site. The first on-line website appeared in 1991 at CERN (the European Organization for Nuclear Research, situated on the Franco-Swiss border near Geneva) – for more information see ViCTE Newsletter Number 5 – WWW History (Part 1) / May 2009 and Number 6 – WWW History (Part 2) / June 2009.

A website may belong to an individual, a business or another organization. Any website can contain hyperlinks to any other website, so distinguishing one particular site from another may sometimes be difficult for the user.

Websites are commonly written in, or dynamically converted to, HTML and are accessed using a web browser. Websites can be accessed from a number of computer-based and Internet-enabled devices, including desktop computers, laptops, PDAs (personal digital assistants) and cell phones.

Image: Website drafts and notes (by Jayel Aheram, via Flickr)

A website is hosted on a computer system called a web server or an HTTP server. These terms also refer to the software that runs on the servers and that retrieves and delivers the web pages in response to users' requests.

Static and dynamic websites are distinguished. A static website is one whose content is not expected to change frequently and is manually maintained by a person or persons using editor software. It provides the same standard information to all visitors for a certain period of time between updates of the site.

A dynamic website is one that has frequently changing information or interacts with the user based on context (HTTP cookies or database variables, e.g. previous history, session variables, server-side variables) or on direct interaction (form elements, mouseovers, etc.). When the web server receives a request for a given page, the page is automatically generated or assembled by the software. A site can display the current state of a dialogue between users, can monitor a changing situation, or can provide information adapted in some way for the particular user.

Static content may also be generated dynamically, either periodically or when certain conditions for regeneration occur, in order to avoid the performance loss of invoking the dynamic engine on every request.
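To make the distinction concrete, here is a minimal sketch of a dynamic page using only Python's standard library (the port number and page content are illustrative assumptions): the HTML is generated afresh for every request, so each visitor sees the current server time, whereas a static site would return the same stored file every time.

```python
# Minimal sketch of a dynamic website: the page is generated on every request.
from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

class DynamicPage(BaseHTTPRequestHandler):
    def do_GET(self):
        # The content is built at request time, unlike a static HTML file.
        body = f"<html><body><h1>Server time: {datetime.now():%H:%M:%S}</h1></body></html>"
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    # Visit http://localhost:8000/ in a browser to see a freshly generated page.
    HTTPServer(("localhost", 8000), DynamicPage).serve_forever()
```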


Some websites demand a subscription to access some or all of their content. Examples of subscription websites include numerous business sites, parts of news websites, academic journal websites, gaming websites, social networking sites, websites offering real-time stock market data, websites providing various services (e.g. storing and/or sharing of images, files, etc.) and many others.

To display active content or even to create rich Internet applications, plug-ins such as Microsoft Silverlight, Adobe Flash, Adobe Shockwave or applets are used. They provide interactivity for the user and real-time updating within web pages (i.e. pages do not have to be loaded or reloaded to effect changes), mainly by applying the DOM (Document Object Model) and JavaScript.

There are many varieties of websites, each specialising in a particular type of content or use, and they may be arbitrarily classified in any number of ways. A few such classifications might include: Affiliate, Archive site, Corporate website, Commerce site, Directory site and many others (see a detailed classification in Types of websites).

In February 2009, Netcraft, an Internet monitoring company that has tracked web growth since 1995, reported that there were 106,875,138 websites with domain names and content on them in 2007 and 215,675,903 in 2009, compared to just 18,000 websites in August 1995.

PS: Spelling – which is better, which is correct: "website" or "web site"?

The form "website" has gradually become the standard spelling. It is used, for instance, by such leading dictionaries and encyclopedias as the Canadian Oxford Dictionary, the Oxford English Dictionary and Wikipedia. Nevertheless, the form "web site" is still widely used, e.g. by Encyclopædia Britannica (including its Merriam-Webster subsidiary). Among major Internet technology companies, Microsoft uses "website" and occasionally "web site", Apple uses "website", and Google uses "website", too.

PPS: Unknown technical terms can be found in the Internet English Vocabulary.

Reference: Website – Wikipedia, the free encyclopedia

Have You Donated To Wikipedia Already?

Do you use Wikipedia? Do you know that Jimmy Wales, a founder of Wikipedia, decided to keep Wikipedia advertising-free and unbiased? As a result, the project now has financial problems with surviving. Any donation, even a small sum, is helpful. Thus, here is the page where you can donate.

Dear visitor, if you want to improve your professional English and at the same time gain basic, comprehensive, targeted information about the Internet and the Web, subscribe to "Why Technical English".

Look at the right sidebar and subscribe as you like:

  • by Email subscription … Sign me up
  • Subscribe with Bloglines
  • Subscribe.ru

Right now, while the e-book "Internet English" is being prepared (see ViCTE Newsletter Number 33 – WWW, Part 1 / August 2011), posts on this topic are being published here. Your comments on the posts are welcome.



The Semantic Web – great expectations

October 31, 2011
3 Comments

By Galina Vitkova

The Semantic Web is a further development of the World Wide Web, aimed at making the content of web pages interpretable as machine-readable information.

In the classical Web based on HTML pages, the information is contained in text or documents, which a browser reads and composes into web pages visible or audible to humans. The Semantic Web is supposed to store information as a semantic network through the use of ontologies. A semantic network is usually a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent relations among the concepts. An ontology is simply a vocabulary that describes objects and how they relate to one another. A program agent is thus able to mine facts directly from the Semantic Web and draw logical conclusions based on them. The Semantic Web functions together with the existing Web and uses the HTTP protocol and URI resource identifiers.

The term Semantic Web was coined by Sir Tim Berners-Lee, the inventor of the World Wide Web and director of the World Wide Web Consortium (W3C), in May 2001 in the journal Scientific American. Tim Berners-Lee considers the Semantic Web the next step in the development of the World Wide Web. W3C has adopted and promoted this concept.

Main idea

The Semantic Web is simply a hyper-structure above the existing Web. It extends the network of hyperlinked, human-readable web pages by inserting machine-readable metadata about pages and about how they are related to each other. It is proposed to help computers "read" and use the Web in a more sophisticated way. Metadata can allow more complex, focused Web searches with more accurate results. To paraphrase Tim Berners-Lee, the extension will let the Web – currently similar to a giant book – become a giant database. Machine processing of information in the Semantic Web is enabled by its two most important features:

  • First – the all-around application of uniform resource identifiers (URIs), which are known as addresses. Traditionally on the Internet these identifiers are used to point hyperlinks at an addressed object (web pages, e-mail addresses, etc.). In the Semantic Web, URIs are also used to specify resources, i.e. a URI identifies exactly one object. Moreover, in the Semantic Web not only web pages or their parts have URIs; objects of the real world may have URIs too (e.g. people, towns, novel titles, etc.). Furthermore, abstract resource attributes (e.g. name, position, colour) have their own URIs. As URIs are globally unique, they make it possible to identify the same objects in different places on the Web. At the same time, URIs of the HTTP protocol (i.e. addresses beginning with http://) can be used as addresses of documents that contain a machine-readable description of these objects.

  • Second – the application of semantic networks and ontologies. Present-day methods of automatic information processing on the Internet are, as a rule, based on frequency and lexical analysis or parsing of text intended for human perception. In the Semantic Web the RDF (Resource Description Framework) standard is applied instead; it uses semantic networks (i.e. graphs whose vertices and edges have URIs) to represent information. Statements coded by means of RDF can then be interpreted by ontologies created in compliance with the RDF Schema and OWL (Web Ontology Language) standards in order to draw logical conclusions. Ontologies are built using so-called description logics. Ontologies and schemata help a computer to understand human vocabulary (a minimal RDF example is sketched below).
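As an illustration of RDF statements (a small sketch assuming the third-party Python library rdflib, which is not mentioned in the article; all URIs are made-up examples), the following code builds two subject–property–object triples about a fictitious book and prints them in Turtle notation.

```python
# Minimal RDF sketch: two subject-property-object triples (requires rdflib).
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/")          # made-up namespace for illustration
g = Graph()

book = URIRef("http://example.org/book/war-and-peace")
g.add((book, EX.title, Literal("War and Peace")))                       # the book has a title
g.add((book, EX.author, URIRef("http://example.org/person/tolstoy")))   # and an author

print(g.serialize(format="turtle"))            # the graph written out as Turtle triples
```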

 

Semantic Web Technologies

The architecture of the Semantic Web can be represented by the Semantic Web Stack, also known as the Semantic Web Cake or Semantic Web Layer Cake. The Semantic Web Stack is an illustration of a hierarchy of languages, where each layer exploits and uses the capabilities of the layers below. It shows how the technologies standardized for the Semantic Web are organized to make the Semantic Web possible. It also shows how the Semantic Web is an extension (not a replacement) of the classical hypertext Web. The illustration was created by Tim Berners-Lee. The stack is still evolving as the layers are concretized.

Semantic Web Stack

As shown in the Semantic Web Stack, the following languages or technologies are used to create the Semantic Web. The technologies from the bottom of the stack up to OWL (Web Ontology Language) are currently standardized and accepted for building Semantic Web applications. It is still not clear how the top of the stack is going to be implemented. All layers of the stack need to be implemented to achieve the full vision of the Semantic Web.

  • XML (eXtensible Markup Language) is a set of rules for encoding documents in machine-readable form. It is a markup language like HTML. XML complements (but does not replace) HTML by adding tags that describe data.
  • XML Schema published as a W3C recommendation in May 2001 is one of several XML schema languages. It can be used to express a set of rules to which an XML document must conform in order to be considered ‘valid’.
  • RDF (Resource Description Framework) is a family of W3C specifications originally designed as a metadata data model. It has come to be used as a general method for conceptual description of information that is implemented in web resources. RDF does exactly what its name indicates: using XML tags, it provides a framework to describe resources. In RDF terms, everything in the world is a resource. This framework pairs the resource with a specific location in the Web, so the computer knows exactly what the resource is. To do this, RDF uses triples written as XML tags to express this information as a graph. These triples consist of a subject, property and object, which are like the subject, verb and direct object of an English sentence.
  • RDFS (RDF Schema, formally the RDF Vocabulary Description Language) provides a basic vocabulary for RDF; it adds classes, subclasses and properties to resources, creating a basic language framework.
  • OWL (Web Ontology Language) is a family of knowledge representation languages for creating ontologies. It extends RDFS, being the most complex layer; it formalizes ontologies, describes relationships between classes and uses logic to make deductions.
  • SPARQL (Simple Protocol and RDF Query Language) is an RDF query language, which can be used to query any RDF-based data. It enables Semantic Web applications to retrieve information (see the query sketch after this list).
  • Microdata (HTML) is an international standard that is applied to nest semantics within existing content on web pages. Search engines, web crawlers and browsers can extract and process Microdata from a web page to provide better search results.
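Building on the same assumptions as the RDF sketch above (the third-party rdflib library and made-up example URIs), the following sketch shows how a SPARQL query retrieves information from such a graph.

```python
# Minimal SPARQL sketch over a small RDF graph (requires rdflib).
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/")
g = Graph()
g.add((URIRef("http://example.org/book/war-and-peace"), EX.title, Literal("War and Peace")))

query = """
    PREFIX ex: <http://example.org/>
    SELECT ?book ?title
    WHERE { ?book ex:title ?title . }
"""
for row in g.query(query):
    print(row.book, row.title)   # prints each matching subject and its title
```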

As mentioned, the top layers contain technologies that are not yet standardized or comprise just ideas. Perhaps the Cryptography and Trust layers are the least familiar of them. Cryptography ensures and verifies the origin of web statements from a trusted source by means of a digital signature of RDF statements. Trust in derived statements means that the premises come from a trusted source and that the formal logic used in deriving new information is reliable.


World Wide Web

September 3, 2011
Leave a Comment

Dear friends of Technical English,

I have just started publishing materials for my planned e-book devoted to Internet English, i.e. English around the Internet. It means that over a certain period of time I will publish posts which will form the basic technical texts of the units of the e-book, whose working title is Internet English. The draft content of the e-book has already been published on my blog http://traintechenglish.wordpress.com in the newsletter Number 33 – WWW, Part 1 / August 2011. One topic in the list corresponds to one unit in the e-book.

Thus below you find the first post of a series dealing with Internet English. I hope these texts will help develop your professional English and at the same time bring you topical information about the Internet.    Galina Vitkova

 

World Wide Web

 Composed by Galina Vitkova

The World Wide Web (WWW or simply the Web) is a system of interlinked hypertext documents that runs over the Internet. A Web browser enables a user to view Web pages that may contain text, images and other multimedia. Moreover, the browser provides navigation between the pages using hyperlinks. The Web was created around 1990 by the Englishman Tim Berners-Lee and the Belgian Robert Cailliau, working at CERN in Geneva, Switzerland.


The term Web is often mistakenly used as a synonym for the Internet itself, but the Web is a service that operates over the Internet, as e-mail, for example, does. The history of the Internet dates back significantly further than that of the Web.

Basic terms

The World Wide Web is the combination of four basic ideas:

  • The hypertext: a format of information which in a computer environment allows one to move from one part of a document to another or from one document to another through internal connections (called hyperlinks) among these documents;
  • Resource Identifiers: unique identifiers used to locate a particular resource (computer file, document or other resource) on the network – commonly known as a URL (Uniform Resource Locator) or URI (Uniform Resource Identifier), although the two have subtle technical differences (see the parsing sketch after this list);
  • The Client-server model of computing: a system in which client software or a client computer makes requests of server software or a server computer that provides the client with resources or services, such as data or files;
  • Markup language: characters or codes embedded in a text, which indicate structure, semantic meaning, or advice on presentation.
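As a small illustration of the second idea (a sketch using Python's standard library; the address itself is an invented example), the code below splits a URL into the components mentioned above.

```python
# Minimal sketch: decomposing a URL into its components.
from urllib.parse import urlparse

url = "http://www.example.com/wiki/World_Wide_Web?section=history#basic-terms"
parts = urlparse(url)

print(parts.scheme)    # 'http' - the protocol to use
print(parts.netloc)    # 'www.example.com' - the server (host) name
print(parts.path)      # '/wiki/World_Wide_Web' - the resource on that server
print(parts.query)     # 'section=history' - optional parameters
print(parts.fragment)  # 'basic-terms' - a position inside the document
```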

 

How the Web works

Viewing a Web page or other resource on the World Wide Web normally begins either by typing the URL of the page into a Web browser, or by following a hypertext link to that page or resource. The act of following hyperlinks from one Web site to another is referred to as browsing, or sometimes as surfing, the Web. The first step is to resolve the server-name part of the URL into an Internet Protocol (IP) address using the global, distributed Internet database known as the Domain Name System (DNS). The browser then establishes a Transmission Control Protocol (TCP) connection with the server at that IP address.

TCP state diagram

The next step is to dispatch a HyperText Transfer Protocol (HTTP) request to the Web server in order to ask for the resource. In the case of a typical Web page, the HyperText Markup Language (HTML) text is first requested and parsed (parsing means syntactic analysis) by the browser, which then makes additional requests for graphics and any other files that form part of the page in quick succession. After that the Web browser renders (see the notes below) the page as described by the HTML, Cascading Style Sheets (CSS) and other files received, incorporating the images and other resources as necessary. This produces the on-screen page that the viewer sees. A minimal sketch of these steps in code follows the notes below.

Notes:

  • Rendering is the process of generating an image from a model by means of computer programs.
  • Cascading Style Sheets (CSS) is a style sheet language used to describe the look and formatting of a document written in a markup language.
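The following minimal Python sketch (standard library only; example.com is a placeholder host) walks through the first of those steps: resolving the server name via DNS, opening a TCP connection, and dispatching a plain HTTP GET request, then printing the status line of the response. A real browser would go on to parse the HTML and render the page.

```python
# Minimal sketch of the steps described above: DNS lookup, TCP connection, HTTP request.
import socket

host = "example.com"                      # placeholder server name taken from a URL

# 1. DNS: resolve the server-name part of the URL to an IP address.
ip_address = socket.gethostbyname(host)
print("Resolved", host, "to", ip_address)

# 2. TCP: establish a connection to that IP address on port 80 (HTTP).
with socket.create_connection((ip_address, 80), timeout=10) as conn:
    # 3. HTTP: dispatch a GET request for the root resource "/".
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))

    # Read part of the response; a browser would now parse the HTML and render the page.
    response = conn.recv(4096).decode("ascii", errors="replace")
    print(response.splitlines()[0])       # e.g. "HTTP/1.1 200 OK"
```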

 

Web standards

At its core, the Web is made up of three standards:

  • the Uniform Resource Identifier (URI), which is a string of characters used to identify a name or a resource on the Internet;
  • the HyperText Transfer Protocol (HTTP), which is a networking protocol for distributed, collaborative, hypermedia information systems; HTTP is the foundation of data communication on the Web;
  • the HyperText Markup Language (HTML), which is the predominant markup language for web pages. A markup language is a system for annotating a text in a way that is syntactically distinguishable from that text.

 


100% integration of renewable energies?

August 13, 2011
1 Comment

Composed by Galina Vitkova

The Renewables-Grid-Initiative (RGI) promotes effective integration of 100% electricity produced from renewable energy sources.

Energy Green Supply

I do not believe this RGI statement. I am sure it is impossible from the technical and technological points of view. Simply recall the very low share of renewables in total world electricity production (3% excluding hydroelectricity), the very high investment costs, and the very high prices of electricity produced from renewables today.

Concerns about climate and energy security (especially, in the case of nuclear power plants) are reasons supporting the efforts for a quick transformation towards a largely renewable power sector. The European emissions reduction targets to keep temperature increase below 2°C require the power sector to be fully decarbonised by 2050. Large parts of society demand that the decarbonisation is achieved predominantly with renewable energy sources.

Different types of renewable energy

Renewables advocates do not say much about real solutions to the genuinely complex problems of renewable sources. Very often they are not even aware of them. Even though renewable energy technologies are now established and appreciated by officials and green activists as a key means of producing electricity in a climate- and environment-friendly way, many crucial problems remain unsolved. Additional power lines, which are needed for transporting electricity from new renewable generation sites to users, have a negative impact on the environment, including biodiversity, ecosystems and the landscape. Furthermore, electricity surpluses, produced by renewables when electricity consumption is very low, cause enormous problems with storage of these surpluses. Besides, there are serious problems with dispatch control of a power system with great penetration of renewables (see Variability and intermittency of wind energy in Number 31 – Giving a definition / July 2011). On the whole, the three most important problems waiting to be solved, each of which demands massive investment, are:

  • building a large number of additional electricity transmission lines to connect numerous, dispersed renewable sites;
  • accommodation of electricity storage needs in the case of electricity surpluses from renewables;
  • integration of intermittent sources of electricity production in scheduled control of power grids.

Thus, concerns about the impacts of renewables integration in European power systems need to be carefully studied, fully understood and addressed.

Let us look more closely at the issues of building new transmission lines. In the coming decade thousands of kilometres of new lines should be built across Europe. Renewable energy sources are abundant and varied, but they are mostly available in remote areas where demand is low and economic activity infrequent. Therefore, thorough strategic planning is required to realise a new grid infrastructure that meets the electricity needs of the next 50–70 years. The new grid architecture is supposed to enable the integration of all renewable energy sources – independently of where and when they generate – and to expand the possibilities for distributed generation and demand-side management.

Grid expansion is inevitable but often controversial. The transmission system operators (TSOs) need to accommodate not only the 2020 targets but also to prepare for the more challenging full decarbonisation of the power sector by 2050. The non-governmental organisations (NGO Global Network) community is still not united with respect to supporting or opposing the grid expansion. A number of technical, environmental and health questions need to be addressed and clarified to improve a shared understanding among and across TSOs and NGOs. RGI is trying to bring together cooperating TSOs and NGOs.

The grid expansion could be accomplished by means of overhead lines and underground cables. Both of them may transmit alternative current (AC) and direct current (DC). In the past it was relatively easy to select between lines and cables:

Cables were mainly used in the grid for shorter distances, mostly because they are more expensive and have a shorter technical lifetime (about 50% of that of overhead lines), whereas overhead lines were used in other cases. Nowadays the situation is more complex, since more options and more parameters must be considered. In the future, cables will probably be used even more as development moves towards higher power levels.

Cables have higher public acceptance because of their lower impact on natural scenery, lower electromagnetic radiation, less interference with wildlife, and higher weather tolerance. Overhead lines, unfortunately, disturb the scenery and seriously affect wildlife and protected areas.

Grid development for expanding renewables by means of overhead lines endangers bird populations in Europe. High, large-scale bird mortality from above-ground power lines arises due to:

  • Risk of electrocution,
  • Risk of collision,
  • Negative impacts on habitats.

All of that constitutes a significant threat to birds and other wildlife. For these reasons, standards to protect birds (the Habitats and Birds Directives) are being worked out.

Moreover, the European Commission is currently working on a new legislation to ensure that the energy infrastructure needed for implementing the EU climate and energy targets will be built in time.



Intermittence of renewables

June 30, 2011
8 Comments

Composed by Galina Vitkova

Everybody knows that renewables are expensive, sometimes very expensive, and make the electricity price go up. For example, in the Czech Republic the expansion of solar photovoltaic installations, subsidised from the state budget, caused an increase in the electricity price of over 12%. Another example of increased costs is given in the table below.

Increase in system operation costs (Euros per MW·h) for 10% and 20% wind share[7]

 

Wind share   Germany   Denmark   Finland   Norway   Sweden
10%          2.5       0.4       0.3       0.1      0.3
20%          3.2       0.8       1.5       0.3      0.7

Nevertheless, only a few people are aware of the great intermittency of renewables, which rules out their use as a main source of electricity generation, not only today but in the future too. Actually, no technical and industrial society can exist and develop using unreliable and intermittent power supplies. Nothing in our integrated and automated world works without electricity, the life-blood of technical civilisation. Just imagine what would happen to a society where the electricity supply was turned off even for a short time, possibly every week, or where the power was cut for a whole fortnight or more. Life stops, production ceases, chaos sets in. And this is exactly what could happen if we bank on renewables. So let us look at features specific to the wind and solar (photovoltaic) power installations typically built in Europe.


The entire problem with renewables is that they are perilously intermittent power sources. The electricity produced by them is not harmonised with the electrical demand cycle. Renewable-based installations generate electricity when the wind blows or the sun shines. Since the energy produced earlier in the day cannot be stored, extra generating capacity has to be brought on-line to cover the deficiency. This means that for every renewable-based system installed, a conventional power station has to be either built or retained to ensure continuity of energy supply. But this power station has to be up and running all the time (i.e. to be a 'spinning reserve'), because it takes up to 12 hours to bring a power station on-line from a cold start-up. Thus, if we want to keep up continuity of supply, the renewable sources result in twice the cost and save very little fossil fuel.

Wind power is extremely variable. Building thousands of wind turbines still does not resolve the fundamental problem of the enormous wind variability. When days without significant winds occur, it doesn’t matter how many wind turbines are installed as they all go off-line. So, it is extremely difficult to integrate wind power stations into a normal generating grid.  

Solar energy is not available at night or on cloudy days, which makes energy storage the most important issue in providing the continuous availability of energy. Off-grid photovoltaic systems traditionally use rechargeable batteries to store excess electricity. With grid-tied systems, excess electricity can be sent to the transmission grid and credited later.

Renewable energy supporters declare that renewable power can somehow be stored to cope with power outages. The first of these energy storage facilities, meant to come to the aid of the thousands of wind turbines standing motionless when the wind does not blow and of solar installations generating nothing when the sun does not shine, is the pumped-water storage system. However, this claim is not well founded, for the following reasons:

  • In most countries of Europe pumped storage systems are already fully used for coping with variability in electrical demand, so as a rule they have no extra capacity left for overcoming variability in supply caused by unreliable wind and solar generation.
  • Pumped storage systems have limited capacity, which allows electricity generation for just a few hours, while wind or solar generation systems can go off-line for days or weeks at a time (a rough capacity estimate is sketched after this list).
  • Pumped storage systems are not only hugely expensive to construct; the topography of European countries also means that very few suitable sites are available.
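To give a feeling for why pumped-storage capacity is measured in hours rather than weeks, here is a rough back-of-the-envelope sketch with invented but plausible reservoir figures (all the numbers are illustrative assumptions, not data from the article).

```python
# Rough sketch: how long could a hypothetical pumped-storage plant cover lost wind output?
# All figures below are illustrative assumptions.

water_volume_m3 = 10_000_000      # usable upper-reservoir volume (10 million m^3)
head_m = 300                      # height difference between reservoirs
density = 1000                    # kg per m^3 of water
g = 9.81                          # gravitational acceleration, m/s^2
turbine_efficiency = 0.85         # assumed generating efficiency

# Potential energy of the stored water, converted to electrical energy.
energy_joules = water_volume_m3 * density * g * head_m * turbine_efficiency
energy_gwh = energy_joules / 3.6e12          # 1 GWh = 3.6e12 J

missing_wind_gw = 2.0             # assumed shortfall when the wind stops blowing
hours_of_cover = energy_gwh / missing_wind_gw

print(f"Stored energy: {energy_gwh:.1f} GWh")
print(f"Covers a {missing_wind_gw} GW shortfall for about {hours_of_cover:.1f} hours")
```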

As for flywheel, compressed-air, battery and hydrogen storage, each of these systems is highly complicated, very expensive, hugely inefficient and limited in capacity. Hydrogen storage in particular is popular and hyped among proponents of renewables. Hydrogen, produced and stored when renewables generate more electricity than can be used, is supposed to propel vehicles and generators. Unfortunately, these hydrogen-powered vehicles and generators are only about 5% efficient. In addition, hydrogen storage vessels are highly flammable and potentially explosive. In practice, there is currently no storage system available that can remotely be expected to stand in for renewable energy sources on a large scale while they are out of operation.

Numerous publications about renewables chiefly inform us about expanding and increasing investments in renewables and about multiplying their installed capacity and volumes of produced electricity, everything in absolute values, without comparing these indicators with those of other resources, especially when production volumes are discussed. The table below compares the volumes of electricity produced by nuclear power plants and by renewable installations. Look it through and form your own opinion of the problem.

Comparison of nuclear and renewable electricity production by top nuclear electricity producers (TW·h per year / % of total electricity production in the country)

 

  Country    Year   Nuclear (2007)   Wind Power        Solar Power
1 USA        2009   837/19.4%        70.8/1.64%        0.808/0.019%
2 Japan      2008   264/23.5%        1.754/0.156%      0.002/0.000%
3 Russia     2008   160/15.8%        0.007/0.0007%     –
4 Germany    2010   141/22.3%        36.5/5.499%       12.0/1.898%
5 Canada     2008   93/14.6%         2.5/0.392%        0.017/0.003%

Conclusion: Ordinary people must know about, and take an interest in, the situation in electricity production and supply. Only then will they be able to press governments to make the right decisions to ensure a stable electricity supply, without which modern civilisation cannot exist and develop.



Study in Ireland

June 12, 2011
Leave a Comment
Dear friends of Technical English,
Here below you find a description of how my former student sees his experience of studying in Ireland. Nowadays there are many opportunities for studying and teaching all across Europe. Learn Technical English and you may get a stay at one of Europe's technical universities.  Galina Vitkova
 
All Ireland Flag

My study in Ireland

By David Jirovec

I spent 8.5 months (both the winter and summer semesters) in Ireland within the EU programme Erasmus. In Cork, Ireland's second biggest city, I studied computer science, the same subject as at the Czech Technical University (CTU) in Prague. Studying in Ireland, namely at the Cork Institute of Technology (CIT), is rather similar to studying at a high school in Bohemia. A student attends his/her class of about 20 participants, and these people study nearly all courses together. We were recommended to choose one of these classes and join it. But since I am in my final year at CTU, I could not find any class with a suitable combination of courses. So in the end I took each course with a different class.

Cork City Marathon 2011

These small classes are used for both lectures and labs, so there are no large lectures for 200 participants as at CTU. Students are never asked to come to the blackboard and present something to the whole class, and the results of a student's tests are never shown to other students.

Exams are carried out only in written form. They take place in very big halls, where students from different courses are present at the same time. Very strict security measures are enforced there: students cannot take any bags with them, and it is forbidden even to have a mobile phone. Exams are easier than at CTU; sometimes it means choosing 3 questions out of a total of 5 and answering them instead of solving all the questions. What is worse, there are no 3 free exam attempts as at CTU. If a student fails once, it is possible to try again in the summer, but it costs some euros. There is no set minimum of points for any individual test; it is only necessary to reach a total of at least 40/100 points at the end of the semester for semester work and exams combined. And no compulsory attendance at classes is required.

Seat of the Rectorate of CTU in Prague

 
Relationships between students and teachers are very good; teachers are friendly and helpful. I had no problems with my English in classes, teachers were easy to understand, but sometimes it was more difficult to understand the students, especially when they were talking to each other. I do not see much improvement in my English grammar, but my communication skills in English improved a lot. It was definitely very beneficial to use English for all day-to-day tasks and conversation, and to observe the little differences between the English commonly used in Ireland and the English taught at school in Prague. Irish people speak English that is often heavy with slang. So I recommend that anybody who is going to visit Ireland consult http://www.urbandictionary.com/define.php?term=what%27s+the+craic%3F in order to understand phrases that come from Celtic community dialects.

PS: The ERASMUS Programme – studying in Europe and more – is the EU's flagship education and training programme, enabling 200,000 students to study and work abroad each year. In addition, it funds co-operation between higher education institutions across Europe. The programme not only supports students, but also professors and business staff who want to teach abroad, as well as helping university staff to receive training. European Commission, Education & Training (http://ec.europa.eu/education/lifelong-learning-programme/doc80_en.htm)


Fusion reactors in the world

May 10, 2011
Leave a Comment
Implosion of a fusion microcapsule

Composed by Galina Vitkova

Fusion power is power generated by nuclear fusion processes. In fusion reactions two light atomic nuclei fuse together to form a heavier nucleus. During the process a comparatively large amount of energy is released.

The term “fusion power” is commonly used to refer to potential commercial production of usable power from a fusion source, comparable to the usage of the term “steam power”. Heat from the fusion reactions is utilized to operate a steam turbine which in turn drives electrical generators, similar to the process used in fossil fuel and nuclear fission power stations.

Fusion power has significant safety advantages in comparison with current power stations based on nuclear fission. Fusion only takes place under very limited and controlled conditions, so a failure of precise control or a pause in fuelling quickly shuts down the fusion reactions. There is no possibility of runaway heat build-up or a large-scale release of radioactivity, and there is little or no atmospheric pollution. Furthermore, the power source comprises light elements in small quantities, which are easily obtained and largely harmless to life, and the waste products are short-lived in terms of radioactivity. Finally, there is little overlap with nuclear weapons technology.

 

Fusion Power Grid

 

Fusion-powered electricity generation was initially believed to be readily achievable, as fission power had been. However, the extreme requirements for continuous reactions and plasma containment led to projections being extended by several decades. More than 60 years after the first attempts, commercial fusion power production is still believed to be unlikely before 2040.

The leading designs for controlled fusion research use magnetic (tokamak design) or inertial (laser) confinement of a plasma.

Magnetic confinement of a plasma

The tokamak (see also Number 29 – Easy such and so / April 2011, Nuclear power – tokamaks), using magnetic confinement of a plasma, dominates modern research. Very large projects like ITER (see also The Project ITER – past and present) are expected to pass several important turning points toward commercial power production, including a burning plasma with long burn times, high power output and on-line fuelling. There are no guarantees that the project will be successful. Unfortunately, previous generations of tokamak machines have revealed new problems many times. But the entire field of high-temperature plasmas is much better understood now than formerly, so ITER is optimistically expected to meet its goals. If successful, ITER would be followed by a "commercial demonstrator" system, similar in purpose to the very earliest power-producing fission reactors built in the period before the wide-scale commercial deployment of larger machines started in the 1960s and 1970s.

Ultrascale scientific computing

 

Stellarators, which also use magnetic confinement of a plasma, are the earliest controlled fusion devices. The stellarator was invented by Lyman Spitzer in 1950 and built the next year at what later became the Princeton Plasma Physics Laboratory. The name "stellarator" originates from the possibility of harnessing the power source of the sun, a stellar object.

Stellarators were popular in the 1950s and 60s, but the much better results from tokamak designs led to their falling from favour in the 1970s. More recently, in the 1990s, problems with the tokamak concept have led to renewed interest in the stellarator design, and a number of new devices have been built. Some important modern stellarator experiments are Wendelstein, in Germany, and the Large Helical Device, in Japan.

Inertial confinement fusion

Inertial confinement fusion (ICF) is a process where nuclear fusion reactions are initiated by heating and compressing a fuel target, typically in the form of a pellet. The pellets most often contain a mixture of deuterium and tritium.

Inertial confinement fusion

 

To compress and heat the fuel, energy is delivered to the outer layer of the target using high-energy beams of laser light, electrons or ions, although for a variety of reasons, almost all ICF devices to date have used lasers. The aim of ICF is to produce a state known as “ignition”, where this heating process causes a chain reaction that burns a significant portion of the fuel. Typical fuel pellets are about the size of a pinhead and contain around 10 milligrams of fuel. In practice, only a small proportion of this fuel will undergo fusion, but if all this fuel were consumed it would release the energy equivalent to burning a barrel of oil.
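The energy figure quoted for the pellet can be checked with a rough calculation. The sketch below is a back-of-the-envelope estimate, not from the article: it uses the standard D-T reaction yield of about 17.6 MeV and takes a barrel of oil as roughly 6.1 GJ, which lands in the same ballpark as the comparison above.

```python
# Order-of-magnitude check: energy released if a 10 mg D-T pellet burned completely.
MEV_TO_J = 1.602e-13             # joules per MeV
ENERGY_PER_REACTION_MEV = 17.6   # D + T -> He-4 + n releases about 17.6 MeV
REACTANT_MASS_KG = 5 * 1.66e-27  # one deuteron plus one triton, ~5 atomic mass units

energy_per_kg = ENERGY_PER_REACTION_MEV * MEV_TO_J / REACTANT_MASS_KG  # ~3.4e14 J/kg

pellet_mass_kg = 10e-6           # 10 milligrams of fuel
pellet_energy_j = pellet_mass_kg * energy_per_kg

BARREL_OF_OIL_J = 6.1e9          # approximate energy of burning one barrel of oil
print(f"Pellet energy: {pellet_energy_j/1e9:.1f} GJ "
      f"(~{pellet_energy_j / BARREL_OF_OIL_J:.1f} barrels of oil)")
```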

To date most of the work in ICF has been carried out in France and the United States, and generally it has seen less development effort than magnetic approaches. Two large projects are currently underway, the Laser Mégajoule in France and the National Ignition Facility in the United States.

All functioning fusion reactors are listed in eFusion experimental devices, classified by confinement method.

 Reference: Wikipedia, the free encyclopedia http://en.wikipedia


The Project ITER – past and present

April 30, 2011
Leave a Comment

Composed by Galina Vitkova

 

The logo of the ITER Organization

 

„We firmly believe that to harness fusion energy is the only way to reconcile huge conflicting demands which will confront humanity sooner or later“

Director-General Osamu Motojima,  Opening address, Monaco International ITER Fusion Energy Days, 23 November 2010

 

ITER was originally an acronym for International Thermonuclear Experimental Reactor, but that title was dropped because of the negative popular connotation of "thermonuclear", especially in conjunction with "experimental". "Iter" also means "journey", "direction" or "way" in Latin, reflecting ITER's potential role in harnessing nuclear fusion (see also The ViCTE Newsletter Number 28 – SVOMT revising / March 2011, Nuclear power – fission and fusion) as a peaceful power source.

ITER is a large-scale scientific project intended to prove the practicability of fusion as an energy source and to prove that it can work without negative impact. Moreover, it is expected to collect the data necessary for the design and subsequent operation of the first electricity-producing fusion power plant. Besides, it aims to demonstrate the possibility of producing commercial energy from fusion. ITER is the culmination of decades of fusion research: more than 200 tokamaks (see also The ViCTE Newsletter Number 29 – Easy such and so / April 2011, Nuclear power – tokamaks) built around the world have paved the way to the ITER experiment. ITER is the result of the knowledge and experience these machines have accumulated. ITER, which will be twice the size of the largest tokamak currently operating, is conceived as the necessary experimental step on the way to demonstrating the potential of a fusion power plant.

The scientific goal of the ITER project is to deliver ten times the power it consumes: from 50 MW of input power, the ITER machine is designed to produce 500 MW of fusion power – the first fusion experiment to produce net energy (this power gain is illustrated in the short sketch below). During its operational lifetime, ITER will test key technologies necessary for the next step and will develop the technologies and processes needed for a fusion power plant – including superconducting magnets and remote handling (maintenance by robot). Furthermore, it will verify tritium breeding concepts and refine neutron shield and heat conversion technology. As a result the ITER project will demonstrate that a fusion power plant is able to capture fusion energy for commercial use.
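A minimal sketch of that figure of merit, using only the input and output powers quoted above:

```python
# The fusion power gain Q is the ratio of fusion power produced to heating power supplied.
input_power_mw = 50        # external heating power supplied to the plasma
fusion_power_mw = 500      # design fusion power of the ITER machine

q_factor = fusion_power_mw / input_power_mw
net_power_mw = fusion_power_mw - input_power_mw

print(f"Q = {q_factor:.0f}")                          # Q = 10: ten times the power it consumes
print(f"Net fusion power over heating: {net_power_mw} MW")
```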

Launched as an idea for international collaboration in 1985, now the ITER Agreement includes China, the European Union, India, Japan, Korea, Russia and the United States, representing over half of the world’s population. Twenty years of the design work and complex negotiations have been necessary to bring the project to where it is today.

The ITER Agreement was officially signed at the Elysée Palace in Paris on 21 November 2006 by Ministers from the seven ITER Members. In a ceremony hosted by French President Jacques Chirac and the President of the European Commission, José Manuel Durão Barroso, this Agreement established a legal international entity to be responsible for construction, operation and decommissioning of ITER.

On 24 October 2007, after ratification by all Members, the ITER Agreement entered into force and officially established the ITER Organization. ITER was originally expected to cost approximately €5 billion. However, the rising price of raw materials and changes to the initial design have more than tripled that amount, to about €16 billion.

Cost Breakdown of ITER Reactor

 

The programme is anticipated to last for 30 years – 10 years for construction and 20 years of operation. The reactor is expected to take 10 years to build, with completion in 2018. The ITER site in Cadarache, France, stands ready: in 2010, construction began on the ITER Tokamak and the scientific buildings. The seven ITER Members have shared in the design of the installation, the creation of the international project structure, and its funding.

Key components for the Tokamak will be manufactured in the seven Member States and shipped to France by sea. From the port in Berre l'Etang on the Mediterranean, the components will be transported by special convoy along the 104 kilometres of the ITER Itinerary to Cadarache. The exceptional size and weight of certain Tokamak components made large-scale public works necessary to widen roads, reinforce bridges and modify intersections. Costs were shared by the Bouches-du-Rhône department Council (79%) and the French State (21%). Work on the Itinerary was completed in December 2010.

Two trial convoys will be organized in 2011 to put the Itinerary’s resistance and design to the test before a full-scale practice convoy in 2012, and the arrival of the first components for ITER by sea.

Between 2012 and 2017, 200 exceptional convoys will travel by night at reduced speeds along the ITER Itinerary, bypassing 16 villages, negotiating 16 roundabouts, and crossing 35 bridges.

Manufacturing of components for ITER has already begun in the Members' industries all over the world. The level of coordination required for the successful fabrication of over one million parts for the ITER Tokamak alone is creating, day by day, a new model of international scientific collaboration.

ITER, without question, is a very complex project. Building ITER will require a continuous and joint effort involving all partners. In any case, this project remains a challenging task, and for most participants it is a once-in-a-lifetime opportunity to contribute to such a fantastic endeavour.

 



Nuclear energy future after Fukushima

March 23, 2011
11 Comments
Composed by Galina Vitkova

What does the damage to the Fukushima plant (see picture below) forecast for Japan – and for the world? But first, let us give a general description of nuclear power stations in order to understand the problems caused by the breakdown.

 

The Fukushima 1 NPP

Nuclear fission. Nowadays nuclear power stations generate energy using nuclear fission (Fukushima belongs to this type of plant). Uranium-235 atoms in the fuel rods of the reactor are split in the process of fission, sustaining a chain reaction with other nuclei. During this process a large amount of energy is released. The energy heats water to create steam, which drives a turbine coupled to a generator, producing electricity.

Depending on the type of fission, estimates of how long fuel supplies will last at the existing level of use vary from several decades for uranium-235 to thousands of years for uranium-238. At the present rate of use, uranium-235 reserves (as of 2007) will be exhausted in about 70 years. The nuclear industry argues that the cost of fuel is a minor component of the cost of fission power. In the future, mining of uranium could become more expensive and more difficult. However, increasing the price of uranium would have little effect on the overall cost of nuclear power: a doubling in the cost of natural uranium would increase the total cost of nuclear power by about 5 percent. By contrast, a doubling of the natural gas price results in roughly a 60 percent increase in the cost of gas-fired power (the short sketch after this paragraph reproduces these figures).
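A minimal sketch of that sensitivity, assuming illustrative fuel-cost shares (about 5% of total cost for uranium in nuclear generation and about 60% for gas in gas-fired generation; these shares are assumptions chosen to be consistent with the percentages above, not data from the article):

```python
# Sketch: how a doubling of fuel price propagates to the total cost of electricity.
# The fuel-cost shares below are illustrative assumptions chosen to match the text above.

def total_cost_increase(fuel_share: float, fuel_price_factor: float) -> float:
    """Relative increase in total generating cost when the fuel price is multiplied."""
    return fuel_share * (fuel_price_factor - 1.0)

uranium_share_of_cost = 0.05   # fuel is a minor component of nuclear generating cost
gas_share_of_cost = 0.60       # fuel dominates the cost of gas-fired generation

print(f"Nuclear, uranium price doubled: +{total_cost_increase(uranium_share_of_cost, 2):.0%}")
print(f"Gas-fired, gas price doubled:   +{total_cost_increase(gas_share_of_cost, 2):.0%}")
```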

The possibility of nuclear meltdowns and other reactor accidents, such as the Three Mile Island accident and the Chernobyl disaster, has caused much public concern. Nevertheless, coal and hydro power stations have both been accompanied by more deaths per unit of energy produced than nuclear power generation.

At present, nuclear energy is in decline, according to a 2007 World Nuclear Industry Status Report presented in the European Parliament. The report outlines that the share of nuclear energy in power production decreased in 21 out of 31 countries, with five fewer functioning nuclear reactors than five years ago. Currently 32 nuclear power plants are under construction or in the pipeline, 20 fewer than at the end of the 1990s.

Fusion. Fusion power could solve many of the problems of fission power. Nevertheless, despite research started in the 1950s, no commercial fusion reactor is expected before 2050. Many technical problems remain unsolved. Proposed fusion reactors commonly use deuterium and lithium as fuel. Assuming a constant level of fusion energy output in the future, the known lithium reserves would last 3,000 years, and lithium from sea water 60 million years. A more complicated fusion process using only deuterium from sea water would have fuel for 150 billion years.

Due to a joint effort of the European Union (EU), America, China, India, Japan, Russia and South Korea a prototype reactor is being constructed on a site in Cadarache (in France). It is supposed to be put into operation by 2018.

Initial projections in 2006 put its price at €10 billion ($13 billion): €5 billion to build and another €5 billion to run and decommission the thing. Since then construction costs alone have tripled.

As the host, the EU is committed to covering 45% of these costs, with the other partners contributing about 9% each. In May 2010 the European Commission asked member states to contribute an additional €1.4 billion to see the project through to 2013. Member states rejected the request.

Sustainability: The environmental movement emphasizes sustainability of energy use and development. “Sustainability” also refers to the ability of the environment to cope with waste products, especially air pollution.

The long-term radioactive waste storage problems of nuclear power have not yet been fully solved. Several countries use underground repositories. Needless to add, nuclear waste takes up little space compared to waste from the chemical industry, which remains toxic indefinitely.

Future of the nuclear industry. Let us return to how the damage to the Fukushima plant affects the future use of nuclear power in Japan – and in the world.

Share of nuclear electricity production in total domestic production

Nuclear plants nowadays provide about a third of Japan's electricity (see chart). Fukushima is not the first to be paralysed by an earthquake, but it is the first to be stricken through the technology's dependence on a supply of water for cooling.

The 40-year-old reactors in Fukushima run by the Tokyo Electric Power Company faced a disaster beyond anything their designers were required to imagine.

What of the rest of the world? Nuclear industry supporters had hopes of a nuclear renaissance as countries try to reduce carbon emissions. There is talk of a boom like that of the 1970s, when 25 or so plants started construction each year in rich countries. Public opinion will surely take a dive. At the least, it will be difficult to find the political will or the money to modernise the West's ageing reactors, though without modernisation they will not become safer. The harrowing images from Fukushima, and the sense of lurching misfortune, will not be forgotten even if the final figures reveal little damage to health. France, which has 58 nuclear reactors, seems to see the disaster in Japan as an opportunity rather than an obstacle for its nuclear industry. On March 14th President Nicolas Sarkozy said that French-built reactors have lost international tenders because they are expensive: "but they are more expensive because they are safer."

However, the region where nuclear power should grow fastest, and which seems least deterred, is the rest of Asia. Two-thirds of the 62 plants under construction in the world are in Asia. Russia plans another ten. By far the most important rising nuclear power is China, which has 13 working reactors and 27 more on the way. China has announced a pause in nuclear commissioning, and a review. But its leaders know that they must move away from coal: the damage to health from a year of Chinese coal-burning plants is bigger than that from the nuclear industry. And if anyone can build cheap nuclear plants, it is probably the Chinese.

If the West turns its back on nuclear power while China carries on, the results could be unfortunate. Nuclear plants need trustworthy and transparent regulation.

  References

  • The risks exposed: What the damage to the Fukushima plant portends for Japan—and the world; The Economist, March 19th 2011
  • Expensive Iteration: A huge international fusion-reactor project faces funding difficulties; The Economist, July 22nd 2010  

 

 

