Why Technical English

Website – basic information

November 28, 2011
7 Comments

Website and Its Characteristics

Composed by Galina Vitkova using Wikipedia

A website (or web site) is a collection of web pages, typically served from a single domain name on the Internet. A web page is a document usually written in HTML (HyperText Markup Language), which is almost always accessible via HTTP (Hypertext Transfer Protocol). HTTP is a protocol that transfers information from the website's server so that it can be displayed in the user's web browser. All publicly accessible websites constitute the immense World Wide Web of information. More formally, a website might be considered a collection of pages dedicated to a similar or identical subject or purpose and hosted through a single domain.
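To make that exchange concrete, here is a minimal sketch in Python (standard library only; the URL is just an example) of the HTTP GET request a browser performs before rendering a page:

    # A minimal sketch of the first step a browser takes: an HTTP GET request.
    from urllib.request import urlopen

    with urlopen("http://example.com/") as response:
        html = response.read().decode("utf-8")  # the HTML document itself
        print(response.status)                  # 200 means the request succeeded
        print(html[:80])                        # the first characters of the page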

The pages of a website are accessed from a common root URL (Uniform Resource Locator) called the homepage, and usually reside on the same physical server. The URLs of the pages organise them into a hierarchy. Nonetheless, it is the hyperlinks between web pages that shape how the reader perceives the overall structure and how the traffic flows between the different parts of the site. The first online website appeared in 1991 at CERN (the European Organization for Nuclear Research, situated in the suburbs of Geneva on the Franco–Swiss border) – for more information see ViCTE Newsletter Number 5 – WWW History (Part 1) / May 2009 and Number 6 – WWW History (Part 2) / June 2009.

A website may belong to an individual, a business or another organization. Any website can contain hyperlinks to any other website, so distinguishing one particular site from another may sometimes be difficult for the user.

Websites are commonly written in, or dynamically converted to, HTML and are accessed using a web browser. Websites can be accessed from a number of computer-based and Internet-enabled devices, including desktop computers, laptops, PDAs (personal digital assistants) and cell phones.

Website Drafts and Notes

Image by Jayel Aheram via Flickr

A website is hosted on a computer system called a web server or an HTTP server. These terms also refer to the software that runs on the servers and that retrieves and delivers the web pages in response to users' requests.

A distinction is made between static and dynamic websites. A static website is one whose content is not expected to change frequently and is manually maintained by a person or persons using editor software. It provides the same standard information to all visitors for a certain period of time between updates of the site.

A dynamic website is one that has frequently changing information or that interacts with the user, either through stored context (HTTP cookies or database variables, e.g. previous history, session variables, server-side variables) or through direct interaction (form elements, mouseovers, etc.). When the web server receives a request for a given page, the page is automatically generated by the software. A site can display the current state of a dialogue between users, monitor a changing situation, or provide information adapted in some way to the particular user.
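The difference can be sketched with Python's standard http.server module. This is an illustrative toy, not production code: the static branch always returns the same text, while the dynamic branch builds its response from the request and the current time:

    # Toy illustration of static vs. dynamic content (standard library only).
    from datetime import datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class DemoHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/static":
                body = "This text is the same for every visitor."  # static page
            else:
                # Dynamic page: generated anew for each request.
                body = f"You asked for {self.path} at {datetime.now():%H:%M:%S}."
            data = body.encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), DemoHandler).serve_forever()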

Static content may also be dynamically generated, either periodically or when certain conditions for regeneration occur, in order to avoid the performance cost of invoking the dynamic engine on every request.


Some websites require a subscription to access some or all of their content. Examples of subscription websites include numerous business sites, parts of news websites, academic journal websites, gaming websites, social networking sites, websites providing real-time stock market data, websites offering various services (e.g. storing and/or sharing of images, files, etc.) and many others.

To display active content or even create rich Internet applications, plug-ins such as Microsoft Silverlight, Adobe Flash, Adobe Shockwave or applets are used. They provide interactivity for the user and real-time updating within web pages (i.e. pages do not have to be reloaded to reflect changes), mainly by applying the DOM (Document Object Model) and JavaScript.

There are many varieties of websites, each specialising in a particular type of content or use, and they may be classified in any number of ways. A few such classifications might include: affiliate, archive site, corporate website, commerce site, directory site and many others (see a detailed classification in Types of websites).

In February 2009, Netcraft, an Internet monitoring company that has tracked web growth since 1995, reported that there were 106,875,138 websites with domain names and content on them in 2007 and 215,675,903 in 2009, compared to just 18,000 websites in August 1995.

PS: Spelling – which is correct: “website” or “web site”?

The form “website” has gradually become the standard spelling. It is used, for instance, by such leading dictionaries and encyclopedias as the Canadian Oxford Dictionary, the Oxford English Dictionary and Wikipedia. Nevertheless, the form “web site” is still widely used, e.g. by Encyclopædia Britannica (including its Merriam-Webster subsidiary). Among major Internet technology companies, Microsoft uses “website” and occasionally “web site”, while Apple and Google use “website”.

PPS: Unknown technical terms can be found in the Internet English Vocabulary.

Reference: Website – Wikipedia, the free encyclopedia

Have You Donated To Wikipedia Already?

Do you use Wikipedia? Do you know that Jimmy Wales, a founder of Wikipedia, decided to keep Wikipedia advertising-free and unbiased? As a result, the project now has financial problems and depends on donations to survive. Any donation, even a small sum, is helpful. So here is the page where you can donate.

Dear visitor, if you want to improve your professional English and at the same time gain basic, comprehensive, targeted information about the Internet and the Web, subscribe to “Why Technical English”.

Look at the right sidebar and subscribe as you like:

  • by Email subscription … Sign me up
  • Subscribe with Bloglines
  • Subscribe.ru

Right now, while the e-book “Internet English” is being prepared (see ViCTE Newsletter Number 33 – WWW, Part 1 / August 2011), posts on this topic are being published here. Your comments on the posts are welcome.


The Semantic Web – great expectations

October 31, 2011
3 Comments

By Galina Vitkova

The Semantic Web is a further development of the World Wide Web, aimed at making the content of web pages interpretable as machine-readable information.

In the classical Web based on HTML pages, information is contained in text or documents that a browser reads and renders into web pages visible or audible to humans. The Semantic Web is supposed to store information as a semantic network through the use of ontologies. A semantic network is usually a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent relations among the concepts. An ontology is simply a vocabulary that describes objects and how they relate to one another. A program agent is thus able to mine facts directly from the Semantic Web and draw logical conclusions based on them. The Semantic Web functions together with the existing Web and uses the HTTP protocol and URI resource identifiers.

The term Semantic Web was coined by Sir Tim Berners-Lee, the inventor of the World Wide Web and director of the World Wide Web Consortium (W3C), in May 2001 in the journal Scientific American. Tim Berners-Lee considers the Semantic Web the next step in the development of the World Wide Web. W3C has adopted and promoted this concept.

Main idea

The Semantic Web is simply a hyper-structure above the existing Web. It extends the network of hyperlinked human-readable web pages by inserting machine-readable metadata about pages and how they are related to each other. It is proposed to help computers “read” and use the Web in a more sophisticated way. Metadata can allow more complex, focused Web searches with more accurate results. To paraphrase Tim Berners-Lee, the extension will let the Web – currently similar to a giant book – become a giant database. Machine processing of information in the Semantic Web is enabled by its two most important features.

  • First – the all-around application of uniform resource identifiers (URIs), which are known as addresses. Traditionally on the Internet these identifiers are used for pointing hyperlinks to an addressed object (web pages, e-mail addresses, etc.). In the Semantic Web, URIs are also used for specifying resources, i.e. a URI identifies exactly one object. Moreover, in the Semantic Web not only web pages or their parts have URIs; objects of the real world may have URIs too (e.g. humans, towns, novel titles, etc.). Furthermore, abstract resource attributes (e.g. name, position, colour) have their own URIs. As URIs are globally unique, they make it possible to identify the same objects in different places on the Web. Concurrently, URIs of the HTTP protocol (i.e. addresses beginning with http://) can be used as addresses of documents that contain a machine-readable description of these objects.

  • Second – the application of semantic networks and ontologies. Present-day methods of automatically processing information on the Internet are, as a rule, based on frequency and lexical analysis or parsing of text designed for human perception. In the Semantic Web, the RDF (Resource Description Framework) standard is applied instead, which uses semantic networks (i.e. graphs whose vertices and edges have URIs) for representing information. Statements coded by means of RDF can be further interpreted by ontologies created in compliance with the RDF Schema and OWL (Web Ontology Language) standards in order to draw logical conclusions. Ontologies are built using so-called description logics. Ontologies and schemata help a computer to understand human vocabulary. A minimal example of such URI-labelled triples follows this list.
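As a small illustration of both features, the following Python sketch builds a tiny semantic network of RDF triples. It assumes the third-party rdflib package (pip install rdflib), and all the example.org URIs are invented for illustration:

    # A tiny semantic network as RDF triples (requires the third-party
    # rdflib package). Every node and edge is identified by a URI; the
    # URIs below are invented examples.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, FOAF

    EX = Namespace("http://example.org/")

    g = Graph()
    g.add((EX.tim, RDF.type, FOAF.Person))      # a real-world object with a URI
    g.add((EX.tim, FOAF.name, Literal("Tim")))  # an attribute with a literal value
    g.add((EX.tim, EX.livesIn, EX.Geneva))      # an edge between two concepts

    for subject, predicate, obj in g:           # the graph is simply triples
        print(subject, predicate, obj)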

 

Semantic Web Technologies

The architecture of the Semantic Web can be represented by the Semantic Web Stack, also known as the Semantic Web Cake or Semantic Web Layer Cake. The Semantic Web Stack is an illustration of a hierarchy of languages, where each layer exploits and uses the capabilities of the layers below. It shows how the technologies standardized for the Semantic Web are organized to make the Semantic Web possible. It also shows how the Semantic Web is an extension (not a replacement) of the classical hypertext Web. The illustration was created by Tim Berners-Lee. The stack is still evolving as the layers are concretized.

Semantic Web Stack

As shown in the Semantic Web Stack, the following languages and technologies are used to create the Semantic Web. The technologies from the bottom of the stack up to OWL (Web Ontology Language) are currently standardized and accepted for building Semantic Web applications. It is still not clear how the top of the stack is going to be implemented. All layers of the stack need to be implemented to achieve the full vision of the Semantic Web.

  • XML (eXtensible Markup Language) is a set of rules for encoding documents in machine-readable form. It is a markup language like HTML. XML complements (but does not replace) HTML by adding tags that describe data.
  • XML Schema, published as a W3C recommendation in May 2001, is one of several XML schema languages. It can be used to express a set of rules to which an XML document must conform in order to be considered ‘valid’.
  • RDF (Resource Description Framework) is a family of W3C specifications originally designed as a metadata data model. It has come to be used as a general method for conceptual description of information that is implemented in web resources. RDF does exactly what its name indicates: using XML tags, it provides a framework to describe resources. In RDF terms, everything in the world is a resource. This framework pairs the resource with a specific location in the Web, so the computer knows exactly what the resource is. To do this, RDF uses triples written as XML tags to express this information as a graph. These triples consist of a subject, property and object, which are like the subject, verb and direct object of an English sentence.
  • RDFS (RDF Schema, the RDF Vocabulary Description Language) provides a basic vocabulary for RDF; it adds classes, subclasses and properties to resources, creating a basic language framework.
  • OWL (Web Ontology Language) is a family of knowledge representation languages for creating ontologies. It extends RDFS and is the most complex layer: it formalizes ontologies, describes relationships between classes and uses logic to make deductions.
  • SPARQL (SPARQL Protocol and RDF Query Language) is an RDF query language, which can be used to query any RDF-based data. It enables semantic web applications to retrieve information; a query sketch follows this list.
  • Microdata (HTML) is a specification used to nest semantics within existing content on web pages. Search engines, web crawlers and browsers can extract and process Microdata from a web page to provide better search results.
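To give a feel for the query layer, here is a minimal SPARQL sketch over a small in-memory graph, again assuming the third-party rdflib package; the data and URIs are invented examples:

    # A minimal SPARQL query over an in-memory RDF graph (requires rdflib).
    from rdflib import Graph

    g = Graph()
    g.parse(data="""
    @prefix ex: <http://example.org/> .
    ex:tim  ex:livesIn ex:Geneva .
    ex:anna ex:livesIn ex:Prague .
    """, format="turtle")

    query = """
    PREFIX ex: <http://example.org/>
    SELECT ?person ?city WHERE { ?person ex:livesIn ?city . }
    """
    for row in g.query(query):
        print(row.person, row.city)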

As mentioned, the top layers contain technologies that are not yet standardized or comprise just ideas. Perhaps the Cryptography and Trust layers are the least familiar of them. Cryptography ensures and verifies the origin of web statements from a trusted source by means of digital signatures of RDF statements. Trust in derived statements means that the premises come from a trusted source and that the formal logic used in deriving new information is reliable.


Study in Ireland

June 12, 2011
Leave a Comment
Dear friends of Technical English,
Below you will find a description of how my former student sees his experience of studying in Ireland. Nowadays there are many opportunities for studying and teaching everywhere across Europe. Learn Technical English and you may get a stay at one of Europe's technical universities.  Galina Vitkova
 
All Ireland Flag

My study in Ireland

By David Jirovec

I spent 8.5 months (both the winter and summer semesters) in Ireland within the EU programme Erasmus. In Cork, Ireland's second biggest city, I studied computer science, the same subject as at the Czech Technical University (CTU) in Prague. Studying in Ireland, namely at the Cork Institute of Technology (CIT), is rather similar to studying at a high school in Bohemia. A student attends his/her class of about 20 participants, and these people study nearly all courses together. We were recommended to choose one of these classes and join it. But since I am in my final year at CTU, I couldn't find any class with a suitable combination of courses. So finally, I took each course with a different class.

Cork City Marathon 2011

These small classes are used for both lectures and labs, so there are no extended lectures for 200 participants as at CTU. Students are never asked to come to the blackboard and present something to the whole class, and the results of any student's tests are never shown to other students.

Exams are carried out only in written form. They take place in very big halls, where students from different courses are present at the same time. Very strict security measures are enforced there: students cannot take any bags with them, and it is forbidden even to have a mobile phone. Exams are easier than at CTU; sometimes it is a matter of choosing 3 questions out of 5 and answering them, instead of solving all the questions. On the downside, there are no 3 free exam attempts as at CTU. If a student fails once, it is possible to try again in the summer, but it costs some euros. There is no given minimum of points for any test; it is only necessary to have a sum of at least 40 of 100 points at the end of the semester for both in-semester work and exams. And no compulsory attendance at classes is required.

Seat of the Rectorate of CTU in Prague

 
Relationships between students and teachers are very good; teachers are friendly and helpful. I had no problems with my English in classes. Teachers were easy to understand, but sometimes it was more difficult to understand the students, especially when they were talking to each other. I don't see much improvement in my English grammar, but my communication skills in English improved a lot. It was definitely very profitable to use English for all day-to-day tasks and conversation, and to observe the little differences between the English commonly used in Ireland and the English taught at school in Prague. The English that Irish people speak is mostly full of slang. So, I recommend anybody who is going to visit Ireland to consult http://www.urbandictionary.com/define.php?term=what%27s+the+craic%3F in order to understand phrases brought about by Celtic community dialects.

PS The ERASMUS Programme – studying in Europe and more – is the EU's flagship education and training programme, enabling 200,000 students to study and work abroad each year. In addition, it funds co-operation between higher education institutions across Europe. The programme not only supports students, but also professors and business staff who want to teach abroad, as well as helping university staff to receive training. European Commission, Education & Training (http://ec.europa.eu/education/lifelong-learning-programme/doc80_en.htm)


Fusion reactors in the world

May 10, 2011
Leave a Comment
Implosion of a fusion microcapsule

Composed by Galina Vitkova

Fusion power is power generated by nuclear fusion processes. In fusion reactions two light atomic nuclei fuse together to form a heavier nucleus. During the process a comparatively large amount of energy is released.
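For the most-studied reaction, deuterium–tritium fusion (D + T → He-4 + n), the released energy can be checked from the mass defect in a few lines of Python (atomic masses in unified atomic mass units; 1 u corresponds to about 931.494 MeV):

    # Energy released by D + T -> He-4 + n, computed from the mass defect.
    m_deuterium = 2.014102   # atomic masses in unified atomic mass units (u)
    m_tritium   = 3.016049
    m_helium4   = 4.002602
    m_neutron   = 1.008665

    mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
    energy_mev = mass_defect * 931.494   # 1 u is about 931.494 MeV
    print(f"{energy_mev:.1f} MeV")       # about 17.6 MeV per reaction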

The term “fusion power” is commonly used to refer to potential commercial production of usable power from a fusion source, comparable to the usage of the term “steam power”. Heat from the fusion reactions is utilized to operate a steam turbine which in turn drives electrical generators, similar to the process used in fossil fuel and nuclear fission power stations.

Fusion power has significant safety advantages in comparison with current power stations based on nuclear fission. Fusion only takes place under very limited and controlled conditions, so a failure of precise control or a pause in fueling quickly shuts down the fusion reactions. There is no possibility of runaway heat build-up or a large-scale release of radioactivity, and little or no atmospheric pollution. Furthermore, the power source comprises light elements in small quantities, which are easily obtained and largely harmless to life, and the waste products are short-lived in terms of radioactivity. Finally, there is little overlap with nuclear weapons technology.

 

Fusion Power Grid

 

Fusion-powered electricity generation was initially believed to be readily achievable, as fission power had been. However, the extreme requirements for continuous reactions and plasma containment led to the projections being extended by several decades. More than 60 years after the first attempts, commercial fusion power production is still believed to be unlikely before 2040.

The leading designs for controlled fusion research use magnetic (tokamak design) or inertial (laser) confinement of a plasma.

Magnetic confinement of a plasma

The tokamak (see also Number 29 – Easy such and so / April 2011, Nuclear power – tokamaks), using magnetic confinement of a plasma, dominates modern research. Very large projects like ITER (see also The Project ITER – past and present) are expected to pass several important milestones toward commercial power production, including a burning plasma with long burn times, high power output, and online fueling. There are no guarantees that the project will be successful. Unfortunately, previous generations of tokamak machines have revealed new problems many times. But the entire field of high-temperature plasmas is much better understood now than formerly, so ITER is optimistically expected to meet its goals. If successful, ITER would be followed by a “commercial demonstrator” system, similar in purpose to the very earliest power-producing fission reactors built in the period before wide-scale commercial deployment of larger machines started in the 1960s and 1970s.

Ultrascale scientific computing

 

Stellarators, which also use magnetic confinement of a plasma, are the earliest controlled fusion devices. The stellarator was invented by Lyman Spitzer in 1950 and built the following year at what later became the Princeton Plasma Physics Laboratory. The name “stellarator” originates from the possibility of harnessing the power source of the sun, a stellar object.

Stellarators were popular in the 1950s and 60s, but the much better results from tokamak designs led to their falling from favor in the 1970s. More recently, in the 1990s, problems with the tokamak concept have led to renewed interest in the stellarator design, and a number of new devices have been built. Some important modern stellarator experiments are Wendelstein, in Germany, and the Large Helical Device, in Japan.

Inertial confinement fusion

Inertial confinement fusion (ICF) is a process where nuclear fusion reactions are initiated by heating and compressing a fuel target, typically in the form of a pellet. The pellets most often contain a mixture of deuterium and tritium.


 

To compress and heat the fuel, energy is delivered to the outer layer of the target using high-energy beams of laser light, electrons or ions, although for a variety of reasons, almost all ICF devices to date have used lasers. The aim of ICF is to produce a state known as “ignition”, where this heating process causes a chain reaction that burns a significant portion of the fuel. Typical fuel pellets are about the size of a pinhead and contain around 10 milligrams of fuel. In practice, only a small proportion of this fuel will undergo fusion, but if all this fuel were consumed it would release the energy equivalent to burning a barrel of oil.
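That comparison can be checked with back-of-the-envelope arithmetic. Here is a rough Python sketch, assuming a complete burn of a 10 mg deuterium–tritium pellet and the roughly 17.6 MeV released per reaction:

    # Order-of-magnitude check: energy in a 10 mg D-T pellet, if fully burned.
    U   = 1.66054e-27   # kilograms per atomic mass unit
    MEV = 1.602e-13     # joules per MeV

    pellet_kg = 10e-6                      # 10 milligrams of D-T fuel
    pair_mass = (2.014102 + 3.016049) * U  # mass of one deuterium-tritium pair
    reactions = pellet_kg / pair_mass
    energy_j  = reactions * 17.6 * MEV
    print(f"{energy_j / 1e9:.1f} GJ")      # a few GJ; a barrel of oil is ~6 GJ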

To date most of the work in ICF has been carried out in France and the United States, and generally it has seen less development effort than magnetic approaches. Two large projects are currently underway: the Laser Mégajoule in France and the National Ignition Facility in the United States.

All functioning fusion reactors are listed in Fusion experimental devices, classified by confinement method.

 Reference: Wikipedia, the free encyclopedia http://en.wikipedia


The Project ITER – past and present

April 30, 2011
Leave a Comment

Composed by Galina Vitkova

 

The logo of the ITER Organization

 

„We firmly believe that to harness fusion energy is the only way to reconcile huge conflicting demands which will confront humanity sooner or later“

Director-General Osamu Motojima,  Opening address, Monaco International ITER Fusion Energy Days, 23 November 2010

 

ITER was originally an acronym for International Thermonuclear Experimental Reactor, but that title was dropped in view of the negative popular connotation of “thermonuclear”, especially in conjunction with “experimental”. “Iter” also means “journey”, “direction” or “way” in Latin, reflecting ITER's potential role in harnessing nuclear fusion (see also The ViCTE Newsletter Number 28 – SVOMT revising / March 2011, Nuclear power – fission and fusion) as a peaceful power source.

ITER is a large-scale scientific project intended to prove the practicability of fusion as an energy source and to demonstrate that it can work without negative impact. Moreover, it is expected to collect the data necessary for the design and subsequent operation of the first electricity-producing fusion power plant. Besides, it aims to demonstrate the possibility of producing commercial energy from fusion. ITER is the culmination of decades of fusion research: more than 200 tokamaks (see also The ViCTE Newsletter Number 29 – Easy such and so / April 2011, Nuclear power – tokamaks) built all over the world have paved the way to the ITER experiment. ITER is the result of the knowledge and experience these machines have accumulated. ITER, which will be twice the size of the largest tokamak currently operating, is conceived as the necessary experimental step on the way to demonstrating the potential of a fusion power plant.

The scientific goal of the ITER project is to deliver ten times the power it consumes. From 50 MW of input power, the ITER machine is designed to produce 500 MW of fusion power – making it the first fusion experiment to produce net energy. During its operational lifetime, ITER will test key technologies necessary for the next step and will develop the technologies and processes needed for a fusion power plant – including superconducting magnets and remote handling (maintenance by robot). Furthermore, it will verify tritium breeding concepts and refine neutron shield and heat conversion technology. As a result, the ITER project will demonstrate that a fusion power plant is able to capture fusion energy for commercial use.
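The gain quoted above is usually written as the fusion gain factor Q, the ratio of fusion power produced to heating power supplied; here is a one-line check in Python using the design figures from the text:

    # Fusion gain factor Q implied by the ITER design figures above.
    p_input_mw  = 50.0    # heating power supplied to the plasma
    p_fusion_mw = 500.0   # designed fusion power output
    print(f"Q = {p_fusion_mw / p_input_mw:.0f}")   # Q = 10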

Launched as an idea for international collaboration in 1985, the ITER Agreement now includes China, the European Union, India, Japan, Korea, Russia and the United States, representing over half of the world's population. Twenty years of design work and complex negotiations were necessary to bring the project to where it is today.

The ITER Agreement was officially signed at the Elysée Palace in Paris on 21 November 2006 by Ministers from the seven ITER Members. In a ceremony hosted by French President Jacques Chirac and the President of the European Commission M. José Manuel Durao Barroso, this Agreement established a legal international entity to be responsible for the construction, operation, and decommissioning of ITER.

On 24 October 2007, after ratification by all Members, the ITER Agreement entered into force and officially established the ITER Organization. ITER was originally expected to cost approximately €5 billion. However, the rising price of raw materials and changes to the initial design have more than tripled that amount, to €16 billion.

Cost Breakdown of ITER Reactor

 

The program is anticipated to last for 30 years – 10 for construction and 20 for operation. The reactor is expected to take 10 years to build, with completion in 2018. The ITER site in Cadarache, France, stands ready: in 2010, construction began on the ITER Tokamak and scientific buildings. The seven ITER Members have shared in the design of the installation, the creation of the international project structure, and its funding.

Key components for the Tokamak will be manufactured in the seven Member States and shipped to France by sea. From the port in Berre l'Etang on the Mediterranean, the components will be transported by special convoy along the 104 kilometres of the ITER Itinerary to Cadarache. The exceptional size and weight of certain Tokamak components made large-scale public works necessary to widen roads, reinforce bridges and modify intersections. Costs were shared by the Bouches-du-Rhône department Council (79%) and the French State (21%). Work on the Itinerary was completed in December 2010.

Two trial convoys will be organized in 2011 to put the Itinerary’s resistance and design to the test before a full-scale practice convoy in 2012, and the arrival of the first components for ITER by sea.

Between 2012 and 2017, 200 exceptional convoys will travel by night at reduced speeds along the ITER Itinerary, bypassing 16 villages, negotiating 16 roundabouts, and crossing 35 bridges.

Manufacturing of components for ITER has already begun in the Members' industries all over the world. The level of coordination required for the successful fabrication of over one million parts for the ITER Tokamak alone is creating a new model of international scientific collaboration.

ITER, without question, is a very complex project. Building ITER will require a continuous and joint effort involving all partners. In any case, this project remains a challenging task, and for most participants it is a once-in-a-lifetime opportunity to contribute to such a fantastic endeavour.

 



Nuclear energy future after Fukushima

March 23, 2011
11 Comments
Composed by Galina Vitkova

What does the damage to the Fukushima plant (see picture below) forecast for Japan – and the world? First, let us give a general description of nuclear power stations in order to understand the problems caused by the breakdown.

 

The Fukushima 1 NPP

Image via Wikipedia

Nuclear fission. Nowadays nuclear power stations generate energy using nuclear fission (Fukushima belongs to this type of nuclear power plant). Uranium-235 atoms in the reactor's fuel rods are split in the process of fission, causing a chain reaction with other nuclei. During this process a large amount of energy is released. The energy heats water to create steam, which rotates a turbine coupled to a generator, producing electricity.

Depending on the type of reactor, projections for ensuring fuel supply at the existing level vary from several decades for uranium-235 to thousands of years for uranium-238. At the present rate of use, uranium-235 reserves (as of 2007) will be exhausted in about 70 years. The nuclear industry argues that the cost of fuel is a minor cost component of fission power. In the future, mining of uranium sources could become more difficult and more expensive. However, an increase in the price of uranium would have little effect on the overall cost of nuclear power. For instance, a doubling in the cost of natural uranium would increase the total cost of nuclear power by about 5 percent. By contrast, doubling the price of natural gas results in a 60 percent increase in the cost of gas-fired power.
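The arithmetic behind that comparison is simple: multiplying a fuel's price raises the total cost of the electricity in proportion to the fuel's share of that cost. A small Python sketch (the shares below are the illustrative figures implied by the text, not exact industry data):

    # Fuel-price sensitivity: a price rise raises total cost in proportion
    # to the fuel's share of that cost.
    def cost_increase(fuel_share, price_factor):
        """Relative increase in total cost when the fuel price is multiplied."""
        return fuel_share * (price_factor - 1)

    # Illustrative shares implied by the text: uranium ~5% of nuclear cost,
    # natural gas ~60% of the cost of gas-fired power.
    print(f"nuclear:   +{cost_increase(0.05, 2):.0%}")   # +5%
    print(f"gas-fired: +{cost_increase(0.60, 2):.0%}")   # +60%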

The possibility of nuclear meltdowns and other reactor accidents, such as the Three Mile Island accident and the Chernobyl disaster, has caused much public concern. Nevertheless, both coal and hydro power generation have been accompanied by more deaths per unit of energy produced than nuclear power generation.

At present, nuclear energy is in decline, according to a 2007 World Nuclear Industry Status Report presented in the European Parliament. The report outlines that the share of nuclear energy in power production decreased in 21 out of 31 countries, with five fewer functioning nuclear reactors than five years ago. Currently 32 nuclear power plants are under construction or in the pipeline, 20 fewer than at the end of the 1990s.

Fusion. Fusion power could solve many of the problems of fission power. Nevertheless, despite research started in the 1950s, no commercial fusion reactor is expected before 2050. Many technical problems remain unsolved. Proposed fusion reactors commonly use deuterium and lithium as fuel. Assuming that fusion energy output stays at the projected level, known lithium reserves would last 3,000 years, and lithium from sea water 60 million years. A more complicated fusion process using only deuterium from sea water would have fuel for 150 billion years.

Due to a joint effort of the European Union (EU), America, China, India, Japan, Russia and South Korea, a prototype reactor is being constructed at a site in Cadarache in France. It is supposed to be put into operation by 2018.

Initial projections in 2006 put its price at €10 billion ($13 billion): €5 billion to build and another €5 billion to run and decommission the thing. Since then construction costs alone have tripled.

As the host, the EU is committed to covering 45% of these costs, with the other partners contributing about 9% each. In May 2010 the European Commission asked member states to contribute an additional €1.4 billion to carry the project through to 2013. Member states rejected the request.

Sustainability: The environmental movement emphasizes sustainability of energy use and development. “Sustainability” also refers to the ability of the environment to cope with waste products, especially air pollution.

The long-term radioactive waste storage problems of nuclear power have not yet been fully solved. Several countries use underground repositories. Needless to add, nuclear waste takes up little space compared to waste from the chemical industry, which remains toxic indefinitely.

Future of the nuclear industry. Let us return to how the damage to the Fukushima plant affects the future usage of nuclear power in Japan – and in the world.

Share of nuclear electricity production in total domestic production

Nowadays nuclear plants provide about a third of Japan's electricity (see chart). Fukushima is not the first to be paralysed by an earthquake, but it is the first to be stricken through the technology's dependence on a supply of water for cooling.

The 40-year-old reactors in Fukushima run by the Tokyo Electric Power Company faced a disaster beyond anything their designers were required to imagine.

What of the rest of the world? Nuclear industry supporters had hopes of a nuclear renaissance as countries try to reduce carbon emissions. A boom like that of the 1970s was talked about, when 25 or so plants started construction each year in rich countries. Public opinion will surely take a dive. At the least, it will be difficult to find the political will or the money to modernise the West's ageing reactors, though without modernisation they will not become safer. The harrowing images from Fukushima, and the sense of lurching misfortune, will not be forgotten even if the final figures reveal little damage to health. France, which has 58 nuclear reactors, seems to see the disaster in Japan as an opportunity rather than an obstacle for its nuclear industry. On March 14th President Nicolas Sarkozy said that French-built reactors have lost international tenders because they are expensive: “but they are more expensive because they are safer.”

However, the region where nuclear power should grow fastest, and seems least deterred, is the rest of Asia. Two-thirds of the 62 plants under construction in the world are in Asia; Russia plans another ten. By far the most important emerging nuclear power is China, which has 13 working reactors and 27 more on the way. China has announced a pause in nuclear commissioning, and a review. But its leaders know that they must move away from coal: the damage to health from a year of Chinese coal-burning plants is bigger than that from the nuclear industry. And if anyone can build cheap nuclear plants, it is probably the Chinese.

If the West turns its back on nuclear power while China carries on, the results could be unfortunate, for nuclear plants need trustworthy and transparent regulation.

  References

  • The risks exposed: What the damage to the Fukushima plant portends for Japan—and the world; The Economist, March 19th 2011
  • Expensive Iteration: A huge international fusion-reactor project faces funding difficulties; The Economist, July 22nd 2010  

 

 


Tactical Media and games

December 1, 2010
Leave a Comment

Composed by Galina Vitkova

  

Introductory notes

Tactical media is a form of media activism that uses media and communication technologies for social movements and privileges temporary, hit-and-run interventions in the media sphere. Attempts to spread information not available from mainstream news are also called media activism. The term was first introduced in the mid-1990s in Europe and the United States by media theorists and practitioners. Since then, it has been used to describe the practices of a vast array of art and activist groups. Tactical media also shares something with the hacker subculture, in particular with software and hardware hacks which modify, extend or unlock closed information systems and technologies.

Tactical Media in Video Games

Video games have opened a completely new avenue for tactical media artists. This form of media allows a wide range of audiences to be informed about a specific issue or idea. Some examples of games that touch on tactical media are Darfur is Dying and September 12. One example of a game design studio that works in tactical media is TAKE ACTION games (TAG). The video game website www.newsgaming.com embodies the idea of tactical media in video games particularly well. Newsgaming coined this name for a new genre that raises awareness of current news issues based on real-world events, as opposed to the fantasy worlds that other video games are built upon. It contributes to an emerging culture largely aimed at raising awareness about important matters in a new and compelling way.

Other examples of tactical media within video games include The McDonald's Game. The author of this game takes information from the executive officers of McDonald's and gives it to the public, informing people about how McDonald's does its business and what means it uses to accomplish it.

Chris Crawford’s Balance of the Planet, made in 1990, is another example of tactical media, in which the game describes environmental issues.

Darfur is Dying description   

Camp of internally displaced Darfuris (Image via Wikipedia)

Origination

It is a browser game about the crisis in Darfur, western Sudan. The game won the Darfur Digital Activist Contest sponsored by mtvU (MTV's university campus channel). Released in April 2006, the game had been played by more than 800,000 people by September. It is classified as a serious game, specifically a newsgame.
The game design was led by Susana Ruiz (then a graduate student at the Interactive Media Program at the School of Cinematic Arts at the University of Southern California) as a part of TAKE ACTION games. In October 2005 she was attending the Games for Change conference in New York City, where mtvU announced that it, in partnership with other organizations, was launching the Darfur Digital Activist Contest for a game that would also serve as an advocacy tool about the situation in the Darfur conflict. Since mtvU offered funding and other resources, Ruiz decided to participate in the project.
Ruiz formed a design team and spent two months creating a game design document and prototype. The team spent much of the design phase talking to humanitarian aid workers with experience in Darfur and brainstorming how to make a game that was both interesting to play and an effective advocacy tool. The Ruiz team's beta version was put up for public review, along with the other finalists, and was chosen as the winner. The team then received funding to complete the game, which was officially released at a Save Darfur Coalition rally on 30 March 2006.
Map of Darfur, Sudan (Image via Wikipedia)

 

Gameplay

The game begins with the player choosing a member of a Darfuri family that has been displaced by the conflict. In the first of the game's two modes, the player controls the chosen family member, who travels from the camp to a well and back while dodging patrols of the janjaweed militia. If captured, the player is informed of what has happened to the selected character and asked to select another member of the family and try again. If the water is successfully carried back to the camp, the game switches into its second mode – a top-down management view of the camp, where the player must use the water for crops and to build huts. When the water runs out, the player must return to the water-fetching level to progress. The goal is to keep the camp running for seven days.

 


 Reception of the game

The game has been reported on by mainstream media sources such as The Washington Post, Time Magazine, BBC News and National Public Radio. In an early September 2006 interview, Ruiz stated that it was difficult to define success for a game with a social goal, but affirmed that more than 800,000 people had played it 1.7 million times since its release. Moreover, tens of thousands of them had forwarded the game to friends or sent a letter to an elected representative. As of April 2007, the game has been played more than 2.4 million times by over 1.2 million people worldwide.

The game has been the focus of debate about its nature and impact. Some academics interviewed by the BBC stated that anything that might spark debate over Darfur and the issues surrounding it is a clear gain for the advocates. Others thought that the game oversimplified a complex situation and thus failed to address the actual issues of the conflict. The game was also criticized over its sponsorship by mtvU, raising the possibility that it might seem like a marketing tool for the corporation. The official site does not use the word “game”, but refers to Darfur is Dying as a “narrative based simulation”.

 


 


Beware of danger when playing PC games online

October 30, 2010
Leave a Comment
Example of firewall function: Blocking spyware...

Image via Wikipedia

  
By P. B.

The Internet is a place where the user can find a lot of information, entertainment or work; but, on the other hand, the same user can “catch” viruses, spyware or malware. Many people don't understand why somebody creates these harmful programs (see the notes below about them). However, the answer is easy: just as in everyday life, we can meet a lot of people with wicked goals. Gaining money through special programs is a typical goal of many Internet thieves, and there are various methods of doing it. On the Internet the user may visit web pages which contain viruses, spyware or malware. This happens very often with game pages, because games are typically associated with gamblers and can therefore be a source of money.

But the harmful code need not be only on the web pages; games themselves can include it. It means that the player, when downloading a game and installing it on the local computer, also installs the harmful code without any suspicion. This can be very dangerous – one small example. Imagine the user installed a game that involves a so-called keylogger. A keylogger is a small program that stealthily records all the keys the user presses. Many antivirus programs consider this software a virus (usually a Trojan horse – see A worm, a virus or a Trojan horse?). So, the keylogger writes all pressed keys to a txt file and sends it to the thief's e-mail address. Suppose the user then visited his online betting site http://www.tipsport.cz, where he had to type the text “www.tipsport.cz” followed by the username “honza” and the password “sazeni123”. The keylogger put this string of characters into the txt file: “www.tipsport.czhonzasazeni123”. The thief received the file, found this text, and was quickly able to log in to the honza account and transfer all the money from Honza's Internet account to his (the thief's) own account. It was easy, wasn't it? Of course, the probability of this coincidence is not very high, but who knows.

Replica of the Trojan Horse in Troy, Turkey

Image by Alaskan Dude via Flickr

 

Notes:

  • Malware means malicious software – authors of malware create programs for harming other software. Malware includes viruses, Trojan horses, spyware and adware.
  • Spyware is a program that uses the Internet for sending data from the computer without the awareness of the computer's user. It differs from a backdoor in the content of the data it sends: it sends only statistical data (e.g. an overview of visited pages or installed programs) and can be used for advertising. Spyware is typically widespread in shareware programs, and the authors of the shareware know about it and tolerate it because they want to earn money.
  • Adware, or advertising-supported software, is any software that automatically downloads advertisements to a computer. The goal of the adware is to generate revenue for its author. Adware, by itself, is harmless; but some adware may come with integrated spyware such as keyloggers.

See more in What is Adware, Spyware and Anti-virus?

 

 


Online game playing

October 25, 2010
3 Comments
By  P. B.

There are a lot of servers on the Internet that provide online game playing. Playing is very easy, and many users who have only a basic knowledge of computers and the Internet can play these games. The most common way to start playing is to open a browser and visit the Google page. Then type the two words “online games” into the search box, and Google immediately offers you many servers, e.g. www.onlinegames.net, www.freeonlinegames.com or the Czech pages www.super-games.cz, etc. Each server offers many games of different sorts. There you may find games for boys, girls and kids, the most-played games, new games, and others. Or you can select games by subject, i.e. adventure games, sports games, war games, erotic or strategic games, etc.

Assigning a path for Leviathan

Image by Alpha Auer, aka. Elif Ayiter via Flickr

Many games have their own manual describing how to play, so the second step is to study the manual. Depending on the subject of the game, the user must use, for example, the Right Arrow key to go forward, the Left Arrow to go back, PgUp to go up, Ctrl to shoot. It is very easy to understand how to play and to recognize the goal of the game, e.g. to gain maximum points, to kill everything that moves, or to be first at the end. These games are rather simple-minded, but some people become too addicted to them, trying to improve their best performance. Sometimes they spend hours in front of the screen every day and lose all track of time.

I have tried four different servers and about six different games. In my opinion these games are very easy and, for me, boring; but for younger users, or for people who are bored right now, the games can be interesting. However, the most important thing (in my view) is that two of the tested servers were infected (my computer warned me that the pages are dangerous and can contain malware, spyware or viruses). Friends of mine who have had such problems with their computers ask me to repair them – maybe that is the reason why I don't like playing games online directly on the Internet.

Quake3 + net_server
Image by [Beta] via Flickr

 

On the other hand, I have also tried the game Quake 3 (the game demo – not through the Internet, but after installing the game on my computer), and I can affirm that it was pretty interesting.

 

Quake 3 Arena is a pure shooting game. There is no goal other than to kill all the other players (though in other modes like Team Deathmatch or Capture the Flag two teams fight against each other). The player can choose the level of difficulty (from easy to hard) and various places. Quake 3 Arena is the mode where the player fights in the Arena against computer-controlled bots (artificially intelligent fighters).

The fighters do battle equipped with various weapons as follows:

  • Gauntlet – a basic weapon for very close combat, usually used only when the player has no other gun;
  • Machinegun – a weak gun, again used only when a better gun is not available;
  • Shotgun – a weapon for close combat, one shot per second;
  • Grenade Launcher – shoots grenades;
  • Rocket Launcher – a very popular weapon because it is easy to use and its impact is huge; but the flight of a rocket is slow, so players get used to shooting at the wall or floor because the rocket's explosion has a large splash radius;
  • Lightning Gun – an electric gun, very effective because it can kill a rival in 2 seconds;
  • Railgun – a weapon for long distances, very accurate, but with a low rate of fire;
  • Plasma Gun – shoots plasma pulses;
  • BFG10K – the most powerful weapon, but the worst balanced, and for this reason not often used by players (BFG = Bio Force Gun).

It is important for players to find and pick up armor – the maximum is 200 armor points. Armor provides protection that absorbs 2/3 of incoming damage. Similarly, players should watch their health (at the start they have 125 points, where 100 points equal 100%, and they can reach a maximum of 200 points).
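As a rough model of the mechanic just described (the split follows the 2/3 rule from the text; the real game engine differs in details):

    # A simplified model of the Quake 3 armor rule described above:
    # armor absorbs 2/3 of incoming damage, as long as armor points last.
    def take_damage(health, armor, damage):
        absorbed = min(armor, damage * 2 // 3)  # armor soaks up to 2/3
        health -= damage - absorbed             # the rest hits health
        armor -= absorbed
        return health, armor

    print(take_damage(health=125, armor=100, damage=90))  # -> (95, 40)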

Sometimes (depending on the game) additional power-ups are available – a Battle Suit, Haste (makes movement and shooting twice as fast for 30 seconds), Invisibility (for 30 seconds), a Medkit, a Teleporter (the player is moved to a random place), Regeneration, Flight (for 60 seconds) and so on.

  

 


Video Games Platforms

October 6, 2010
Leave a Comment
  
Composed by Galina Vitkova

 

Terminology

The term game platform refers to the particular combination of electronic or computer hardware which, together with low-level software, allows a video game to run. In general, a hardware platform means a group of compatible computers that can run the same software. A software platform comprises a major piece of software, such as an operating system, an operating environment, or a database, under which various smaller application programs can be designed to run. The main video game platforms are reviewed below.

  

Platforms for PC games 

PC games often require specialized hardware in the user's computer in order to play, such as a specific generation of graphics processing unit or an Internet connection for online play, although these system requirements vary from game to game. In any case, your PC's hardware capabilities should meet the minimum requirements established for the particular game. On the other hand, many modern computer games allow, or even require, the player to use a keyboard and mouse simultaneously without demanding any additional devices.

As of the 2000s, PC games are often regarded as offering a deeper and more complex experience than console games. 

 

Video game consoles platform

A video game console is an interactive entertainment computer or modified computer system that produces a video display signal which can be used with a display device to show video games.    

Usually, this system is connected to a common television set or composite video monitor. A composite monitor is any analog video display that receives input in the form of an analog composite video signal through a single cable. The monitor is different from a conventional TV set because it does not have an internal RF (Radio Frequency) tuner or RF converter. However, a user can install an external device that emulates a TV tuner. 

  

Handheld game consoles platform

A handheld game console is a lightweight, portable electronic device of a small size with a built-in screen, games controls and speakers. A small size allows people to carry handheld game consoles and play games at any time or place. 

A One Station handheld console with game

Image via Wikipedia

The oldest true handheld game console with interchangeable cartridges is the Milton Bradley Microvision, released in 1979.

Nintendo popularized the handheld console concept with the Game Boy, released in 1989, and continues to dominate the handheld console market with successive Game Boy models and, most recently, the Nintendo DS.

  

Handheld electronic games platform

In the past decade, handheld video games have become a major sector of the video game market. For example, in 2004 sales of portable software titles exceeded $1 billion in the United States.

The Gizmondo handheld video game unit (Image via Wikipedia)

Handheld electronic games are very small portable devices for playing interactive electronic games, often miniaturized versions of video games. The controls, display and speakers are all part of a single unit. They usually have displays designed to play one game. Thanks to this simplicity they can be made as small as a digital watch, and sometimes are. They usually have no interchangeable cartridges or disks and are not reprogrammable. The visual output of these games can range from a few small light bulbs or light-emitting diodes (LEDs) to calculator-like alphanumeric screens. Nowadays these outputs have mostly been displaced by liquid crystal and vacuum fluorescent display screens. Handhelds were most popular from the late 1970s into the early 1990s. They are both the precursors of and inexpensive alternatives to the handheld game console.

Mobile games platform

A mobile game is a video game played on a mobile phone, smartphone, PDA (Personal Digital Assistant), handheld computer or portable media player.  

The 16 best iPhone games of 2009

Image by docpop via Flickr

The first game pre-installed on a mobile phone was Snake, on selected Nokia models in 1997. Snake and its variants have since become the most-played video game on the planet, with over a billion people having played it. Mobile games are played using the technologies present on the device itself. The games may be installed over the air, sideloaded onto the handset with a cable, or embedded on the handheld devices by the original equipment manufacturer (OEM) or by the mobile operator.

For networked games, there are various technologies in common use, for example, text message (SMS), multimedia message (MMS) or GPRS location identification. 

  

Arcade games 

The Simpsons arcade game by Konami

Image by Lost Tulsa via Flickr

An arcade game is a coin-operated entertainment machine, usually installed in public businesses such as restaurants, public houses, and video arcades. Most arcade games are redemption games, merchandisers (such as claw cranes), video games, or pinball machines. The golden age of arcade video games in the early 1980s was the peak era of arcade game popularity, innovation, and earnings.

By the late 1990s and early 2000s, networked gaming via consoles and computers across the Internet had appeared and displaced arcade games. The arcades also lost their position at the forefront of new game releases. Given the choice between playing a game at an arcade three or four times (perhaps 15 minutes of play for a typical arcade game) and renting, at about the same price, the exact same game for a video game console, people selected the console. To remain viable, arcades added other elements to complement the video games, such as redemption games, merchandisers, and games that use special controllers largely inaccessible to home users. Besides, they equipped games with reproductions of automobile or airplane cockpits, motorcycle- or horse-shaped controllers, or highly dedicated controllers such as dancing mats and fishing rods. Moreover, today's arcades have extended their activities with food service etc., striving to become “fun centers” or “family fun centers”.

All modern arcade games use solid state electronics and integrated circuits. In the past coin-operated arcade video games generally used custom per-game hardware often with multiple CPUs, highly specialized sound and graphics chips, and the latest in computer graphics display technology. Recent arcade game hardware is often based on modified video game console hardware or high-end PC components.

References:   http://en.wikipedia.org/

 

 

