Whether migrating a database or project from MySQL to PostgreSQL, or choosing PostgreSQL for a new project with only MySQL knowledge, there are a few things to know about PostgreSQL and the differences between the two database systems.
PostgreSQL is a fully open source database system released under its own license, the PostgreSQL License, which is described as "a liberal Open Source license, similar to the BSD or MIT licenses." This has allowed the PostgreSQL Global Development Group (commonly referred to as PGDG), which develops and maintains the open source project, to improve the project with help from people around the world, turning it into one of the most stable and feature-rich database solutions available. Today, PostgreSQL competes with the top proprietary and open source database systems for features, performance, and popularity.
PostgreSQL is a highly SQL-standard-compliant relational database system that's scalable, customizable, and has a thriving community of people improving it every day.
What PostgreSQL Needs
In a previous blog, we discussed setting up and optimizing PostgreSQL for a new project. It is a good introduction to PostgreSQL configuration and behavior, and can be found here: https://severalnines.com/blog/setting-optimal-environment-postgresql.
If migrating an application from MySQL to PostgreSQL, the best place to start is to host it on hardware or a hosting platform similar to that of the source MySQL database.
If hosting the database on premises, bare metal hosts (rather than virtual machines) are generally the best option for PostgreSQL. Virtual machines add some helpful features at times, but they come at the cost of some of the host's power and performance, while bare metal gives the PostgreSQL software full access to performance, with fewer layers between it and the hardware. On-premises hosts also need an administrator to maintain the databases, whether a full-time employee or a contractor, whichever makes more sense for the application's needs.
In The Cloud
Cloud hosting has come a long way in the past few years, and countless companies across the world host their databases in cloud based servers. Since cloud hosts are highly configurable, the right size and power of host can be selected for the specific needs of the database, with a cost that matches.
Depending on the hosting option used, new hosts can be provisioned quickly, memory / CPU / disk can be tweaked easily, and additional backup methods may be available. When choosing a cloud host, check whether a host is dedicated or shared; dedicated is better for extremely high-load databases. Another key is to make sure the IOPS available for the cloud host are sufficient for the database's activity needs. Even with a large memory pool for PostgreSQL, there will always be disk operations to write data to disk, or to fetch data when it is not in memory.
Since PostgreSQL is increasing in popularity, it's now available on many cloud database hosting services such as Heroku and Amazon AWS, and is quickly catching up to the popularity of MySQL. These services allow a third party to host and manage a PostgreSQL database easily, allowing focus to remain on the application.
Concepts / Term Comparisons
There are a few comparisons to cover when migrating from MySQL to PostgreSQL: common configuration parameters, terms, and concepts that operate similarly but have their differences.
Various database terms can have different meanings within different implementations of the technology. Between MySQL and PostgreSQL, there are a few basic terms that are understood slightly differently, so a translation is sometimes needed.
In MySQL, a ‘cluster’ usually refers to multiple MySQL database hosts connected together to appear as a single database or set of databases to clients.
In PostgreSQL, when referencing a ‘cluster’, it is a single running instance of the database software and all its sub-processes, which then contains one or more databases.
In MySQL, queries can access tables from different databases at the same time (provided the user has permission to access each database).
SELECT * FROM customer_database.customer_table t1 JOIN orders_database.order_table t2 ON t1.customer_id = t2.customer_id WHERE name = 'Bob';
However, in PostgreSQL this cannot happen without Foreign Data Wrappers (a topic for another time). Instead, a PostgreSQL database can contain multiple 'schemas', which operate similarly to databases in MySQL. Schemas contain the tables, indexes, etc., and can be accessed simultaneously through the same connection to the database that houses them.
SELECT * FROM customer_schema.customer_table t1 JOIN orders_schema.order_table t2 ON t1.customer_id = t2.customer_id WHERE name = 'Bob';
Interfacing with PostgreSQL
In the MySQL command line client (mysql), interfacing with the database uses keywords like 'DESCRIBE table' or 'SHOW TABLES'. The PostgreSQL command line client (psql) uses its own set of 'backslash commands'. For example, instead of 'SHOW TABLES', PostgreSQL's command is '\dt', and instead of 'SHOW DATABASES;', the command is '\l'.
A full list of commands for ‘psql’ can be found by the backslash command ‘\?’ within psql.
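For example, a short psql session covering the equivalents of common MySQL commands might look like this (the database and table names are hypothetical):

```
$ psql -U postgres customer_database

-- MySQL: SHOW DATABASES;
\l
-- MySQL: SHOW TABLES;
\dt
-- MySQL: DESCRIBE customer_table;
\d customer_table
-- full list of backslash commands:
\?
```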
Like MySQL, PostgreSQL has libraries and plugins for all major languages, as well as ODBC drivers along the lines of those for MySQL and Oracle. Finding a great, stable library for any language needed is an easy task.
Unlike MySQL, PostgreSQL has a wide range of supported Procedural Languages to choose from. In the base install of PostgreSQL, the supported languages are PL/pgSQL (SQL Procedural Language), PL/Tcl (Tcl Procedural Language), PL/Perl (Perl Procedural Language), and PL/Python (Python Procedural Language). Third party developers may have more languages not officially supported by the main PostgreSQL group.
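As a small illustration, here is a minimal PL/pgSQL function, roughly the equivalent of a MySQL stored function (the table and column names are made up for the example):

```sql
-- Look up a customer by id and return a greeting.
-- Assumes a hypothetical customer_table(customer_id integer, name text).
CREATE OR REPLACE FUNCTION greet_customer(p_id integer)
RETURNS text AS $$
DECLARE
    v_name text;
BEGIN
    SELECT name INTO v_name FROM customer_table WHERE customer_id = p_id;
    IF v_name IS NULL THEN
        RETURN 'customer not found';
    END IF;
    RETURN 'Hello, ' || v_name;
END;
$$ LANGUAGE plpgsql;
```

Calling it is then a plain SELECT: SELECT greet_customer(1);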
Memory configuration is another point of comparison. MySQL tunes its main cache with key_buffer_size when using MyISAM, and with innodb_buffer_pool_size when using InnoDB.
PostgreSQL uses shared_buffers for the main memory block given to the database for caching data, generally set to around a quarter of system memory unless certain scenarios require otherwise. Queries use work_mem for sorting and hashing; this value should be increased cautiously, as it applies per operation and per connection.
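As a sketch, a dedicated 16 GB host might start with values along these lines in postgresql.conf (the numbers are illustrative starting points, not recommendations):

```
# postgresql.conf -- illustrative starting values for a dedicated 16 GB host
shared_buffers = 4GB    # roughly a quarter of system memory
work_mem = 16MB         # per sort/hash operation, per connection; raise cautiously
```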
Tools for migration
Migrating to PostgreSQL can take some work, but the community has developed tools to help with the process. Generally they convert / migrate the data from MySQL to PostgreSQL and recreate the tables / indexes. Stored procedures and functions are a different story, and usually require manual rewriting, either in part or from the ground up.
Some example tools available are pgloader and FromMySqlToPostgreSql. Pgloader is a tool written in Common Lisp that imports data from MySQL into PostgreSQL using the COPY command, and loads data, indexes, foreign keys, and comments with data conversion to represent the data correctly in PostgreSQL as intended. FromMySqlToPostgreSql is a similar tool written in PHP, and can convert MySQL data types to PostgreSQL as well as foreign keys and indexes. Both tools are free, however many other tools (free and paid) exist and are newly developed as new versions of each database software are released.
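As a sketch of pgloader usage, a migration can be driven by a small command file like the following (the connection strings are placeholders):

```
-- migrate.load: a minimal pgloader command file (placeholder connection strings)
LOAD DATABASE
     FROM mysql://mysql_user@mysql-host/customer_database
     INTO postgresql://pg_user@pg-host/customer_database
WITH include drop, create tables, create indexes, reset sequences;
```

Running pgloader with this file then copies the schema and data in one pass.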
Converting should always include in depth evaluation after the migration to make sure data was converted correctly and functionality works as expected. Testing beforehand is always encouraged for timings and data validation.
If coming from MySQL where replication has been used, or if replication is needed for any reason, PostgreSQL has several options available, each with its own pros and cons, depending on what is needed from replication.
By default, PostgreSQL has its own built-in replication mode for Point In Time Recovery (PITR). This can be set up using either file-based log shipping, where Write Ahead Log files are shipped to a standby server to be read and replayed, or Streaming Replication, where a read-only standby server fetches transaction logs over a database connection and replays them.
Either of these built-in options can be set up as a 'warm standby' or a 'hot standby'. A 'warm standby' doesn't allow connections, but is ready to become a master at any time to replace a master having issues. A 'hot standby' allows read-only connections to connect and issue queries, in addition to being ready to become a read/write master at any time if needed.
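As a rough sketch of a streaming hot standby setup (PostgreSQL 10-era parameters; the hostname and user are placeholders):

```
# On the primary, postgresql.conf:
wal_level = replica
max_wal_senders = 5

# On the standby, postgresql.conf:
hot_standby = on        # allow read-only queries; off would give a warm standby

# On the standby, recovery.conf:
standby_mode = 'on'
primary_conninfo = 'host=primary-host user=replication_user'
```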
One of the oldest replication tools for PostgreSQL is Slony, a trigger-based replication method that allows a high level of customization. Slony allows the setup of a master node and any number of replica nodes, the ability to switch the master to any node desired, and lets the administrator choose which tables to replicate (if not wanting all tables). It's been used not just for replicating data in case of failure or for load balancing, but also for shipping specific data to other services, and even for minimal-downtime upgrades, since replication can go across different versions of PostgreSQL.
Slony does have one main requirement: every table to be replicated must have either a PRIMARY KEY or a UNIQUE index without nullable columns.
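Before setting up Slony, a catalog query along these lines can flag ordinary tables that lack a primary key (a sketch; it does not check for qualifying UNIQUE indexes):

```sql
-- List user tables that have no PRIMARY KEY constraint,
-- i.e. candidates needing attention before Slony replication.
SELECT n.nspname AS schema_name, c.relname AS table_name
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE c.relkind = 'r'
  AND n.nspname NOT IN ('pg_catalog', 'information_schema')
  AND NOT EXISTS (
      SELECT 1 FROM pg_constraint con
      WHERE con.conrelid = c.oid AND con.contype = 'p'
  );
```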
When it comes to multi-master options, Bucardo is one of the few for PostgreSQL. Like Slony, it's a third-party software package that sits on top of PostgreSQL. Bucardo calls itself "an asynchronous PostgreSQL replication system, allowing for both multi-master and multi-slave operations." The main benefit is multi-master replication that works fairly well; however, it lacks conflict resolution, so applications should be aware of possible issues and handle them accordingly.
There are many other replication tools as well, and finding the one that works best for an application depends on the specific needs.
PostgreSQL has a thriving community willing to help with any issues / info that may be needed.
An active IRC chatroom named #postgresql is available on Freenode, where administrators and developers worldwide chat about PostgreSQL and related projects / issues. There are even smaller rooms for specifics like Slony, Bucardo, and more.
There are a handful of PostgreSQL mailing lists, including 'general', 'admin', 'performance', and even 'novice' (a great place to start if new to PostgreSQL). The mailing lists are subscribed to by many people around the world and provide a wealth of resources for answering any question that may come up.
A full list of PostgreSQL mailing lists can be found at https://www.postgresql.org/list/
User groups are a great place to get involved and active in the community, and many large cities worldwide have a PostgreSQL User Group (PUG) available to join and attend, and if not, consider starting one. These groups are great for networking, learning new technologies, and even just asking questions in person to people from any level of experience.
Most importantly, PostgreSQL is documented very well. Information on configuration parameters, SQL functions, and usage can all be easily found in the official documentation on PostgreSQL's website. If anything is at all unclear, the community will help through the options outlined above.
The 2018 StackOverflow survey has just been published, with good news for PostgreSQL.
StackOverflow received more than 100,000 responses to a comprehensive 30-minute survey.
PostgreSQL is the third most commonly used database, with 33% of respondents, slightly behind MySQL and SQL Server, yet well ahead of other options. Early in January, the DB-Engines results showed PostgreSQL in 4th place behind Oracle, yet here we see that Oracle actually heads up the Most Dreaded list along with DB2, leaving PostgreSQL to power through to 3rd place.
PostgreSQL at 62% is the second most loved database, so close behind Redis (at 64%) that they're almost even. But then Redis is only used by 18.5% of people, and it's very much a different beast anyway – yes, it's a datastore, but not a fully featured database like PostgreSQL and others.
Notice that neither MySQL nor SQL Server is well loved, yet enough people use them that we can be pretty certain of that as a collective opinion.
Later we learn that SQL Server has a strong correlation with C# and that MySQL has a strong correlation with PHP/HTML/CSS/WordPress, so they are both the main database choice for those software stacks. What's interesting there is that PostgreSQL doesn't have any correlation towards Java, Python, Ruby, etc. Or, if I might interpret that differently, it is equally popular amongst developers from all languages who aren't already using LAMP or MS stacks.
Later we learn that 57.5% of people love SQL, which is pretty much everyone that uses it.
We’ll do some more analysis when the anonymized data is available, just to double check these analyses.
In our previous article we saw three classic Database Modelization Anti-Patterns. The article also contains a reference to a Primary Key section of my book Mastering PostgreSQL in Application Development, so it’s only fair that I would now publish said Primary Key section!
So in this article we dive into primary keys as a cornerstone of database normalization. It's so important to get primary keys right that you would think everybody knows how to do it, and yet, most of the primary key constraints I've seen used in database design are actually not primary keys at all.
This is the March 2018 issue of the monthly DB-Engines Ranking of database management systems.
You can find the complete and most up-to-date ranking at DB-Engines.com.
3. Microsoft SQL Server: 1104.79 (-17.25)
7. Microsoft Access: 131.95 (+1.88)
Offensiveness or wrong-headedness hurt no one. The claim that they do is designed to shut down legitimate debate.
The proper antidote to speech that offends is other speech. Opinion anchored in reason can be countered by other opinion. Lies can be exposed by factual evidence. Truth emerges from debate and disagreement.
The only sort of speech that deserves to be banned, on campus or elsewhere, is that which peddles true prejudice. This means speech that attacks people on the basis of an irrational hatred which by definition is immune to reasoned argument. Defending colonialism is an opinion. Saying black people are inferior is bigotry.
To read my whole Times column (£), please click here.
As with many stories, if you have something to tell, it quickly takes up a lot of space. Therefore, this will be a series of blog posts on Postgres and a bit of Oracle. It will be a short series, though…
After switching to a more novel system, we adopted DEC (Digital Equipment Corporation) VAX, VMS and MicroVAX systems! Arguably still the best operating system around… In any case, it brought us the ability to run Oracle, the only valid database alternative around at the time, with a shining Oracle version 6.2 soon to be replaced by version 7.3.4. Okay, truth be said, at that time I wasn't really that deep into databases, so much of the significance was added later. My primary focus was on getting the job done, serving the business in making people better. Still, working with SQL and analyzing data soon became one of my hobbies.
Some rain has to fall
This disconnect was the first one. Moving forward, I noticed and felt more and more of a disconnect between Oracle and what I like to call core technology. Call me what you will, but I feel that if you want to bring a database to the market and want to stay on top of your game, your focus needs to be seriously on that database.
Instead we saw ever more focus on "non-core" technology. Oracle Fusion, Oracle Applications (okay, Oracle Apps had always been there), and as time progressed, the dilution became ever greater. I grew more and more in the belief that Oracle didn't want to be that database company anymore (which proved to be true in the end), but it was tough for me to believe. Here I was, having spent most of my active career focused on this technology, and now it was derailing (or so it felt to me).
We saw the final signs with the elimination of Oracle Standard Edition One, basically forcing an entire contingent of their customers either out (too expensive) or up (invest in Oracle Standard Edition Two, and deal with more cost for less functionality). What appeared to be a good thing ended up leaving a bad taste in the mouth.
And, of course… the Oracle Cloud, I am not even going to discuss that in this blog-post, sorry.
The switch to Postgres
For me the switch came in two stages. First, there was this situation where I was looking for something to do… I had completed my challenge and, through a good friend, ran into the kind people of EnterpriseDB. A company I had only little knowledge of, doing stuff for PostgreSQL (or Postgres if you like; please, no Postgré or anything alike – find more about the project here), a database I had not much more knowledge of. But their challenge was very interesting: grow and show Postgres and the good things it brings to the market.
Before I knew it, I was studying Postgres and all the things that Postgres brings. Which was easy enough in the end, as the internal workings and structures of Postgres and Oracle do not differ that much. I decided to do a presentation on the differences between Postgres and Oracle in Riga. I was kindly accepted by the committee, even when I told them my original submission had changed!
A very good experience, even today, but with an unexpected consequence: the second part of the switch was Oracle's decision to cut me from the Oracle ACE program.
It does free me up, somehow, to help database users across Europe re-evaluate their Oracle buy-in and lock-in, and to look at smarter and (much) more cost-effective ways to handle their database workloads. This finalized "the switch", so to speak.
Meanwhile, more and more people are realizing that there actually are valid alternatives to the Oracle database. Since the adoption of the Oracle database as the only serious solution back in the early 1990s, the world has changed, also for serious database applications!
End of Part I
A link to the follow-up blog post will be placed here shortly.
I would like to take this chance to announce the immediate availability of SWORD release 1.8.0. I know this release has been a long time in coming, but the long wait comes with lots of benefits for users, developers, and maintainers. The benefits to users and developers are mentioned elsewhere, throughout the code and other places. The main benefit to maintainers is that there are now automated tests in place and the release process is automated. This means that future releases on the 1.8 branch can be easily executed whenever needed. Have a Merry Christmas, everyone! And keep your eyes open for a 1.8.1 in the not too distant future to fix up build issues in the binding code. Otherwise, you can get the code you're looking for below:

MD5: 095dbd723738c2a232c85685a11827a8 sword-1.8.0.tar.gz
SHA512: c45f3135255322a77e955297997db2529f31b397c42cc4b9474dc6ec8d329b2233b292078979de5fbf33cad4a1a607aabb66f86501072a729d68e9fc840c8c8e sword-1.8.0.tar.gz
URL: sword-1.8.0.tar.gz

--Greg
Many people are understandably baffled by the recent UN vote condemning President Trump’s recognition of Jerusalem as Israel’s capital. Since such a vote has zero practical effect, they ask, what was the point of it?
Well indeed. As the American ambassador to the UN Nikki Haley said in her barnstorming response, America will still be moving its embassy to Jerusalem regardless of the UN’s opinion.
The resolution didn't need to have any practical import. It was merely part of the UN's theatre of hatred, the malevolent campaign it has waged for decades against Israel and Israel alone as a result of the preponderance of tyrannies, dictatorships, kleptocracies and genocidal antisemitic regimes that make up what's called the UN's "non-aligned bloc" and which are united in their desire that Israel should be wiped off the map.
So egregious is this hypocrisy in singling out Israel, the sole democracy and upholder of human rights in the region, while ignoring the brutal and murderous record of those tyrannies, dictatorships, kleptocracies and genocidal antisemitic regimes, that even a CNN correspondent has been moved to call this out. Jake Tapper tweeted that among the 128 countries that voted in favor of the UN resolution condemning the US decision to move the Israeli embassy to Jerusalem were "some countries with some rather questionable records of their own".
You don’t say. The shocking thing is that so many democratic nations voted alongside these tyrannies: nations such as Germany, Belgium, Ireland, Italy, Luxembourg, the Netherlands, most disappointingly India and, most sickening (to me, anyway), the UK.
Britain, the historic cradle of liberty and democracy and which once fought to defend freedom, has now made common cause with China, Iran, Libya, North Korea and Russia in their joint aim of denying the right of the Jewish people to declare, in accordance with law and history, the capital city of their own country, a right the UK and these other states would deny to no other people or state. What a disgrace.
What on earth did the UN think it was doing? What does Britain’s Prime Minister Theresa May think she’s doing? Does nobody in the British government have a clue about upholding international law or sovereignty? For the real point about this UN vote was that, on this occasion, the principal target wasn’t actually Israel. It was America, and its sovereign right to govern itself. The UN was telling the United States it was not entitled to conduct its own foreign policy in the way it thinks fit.
As Brook Goldstein of the Lawfare Project has observed, this contravenes the UN’s own charter:
“Article 2(7) of the UN Charter is crystal clear: ‘Nothing contained in the present Charter shall authorize the United Nations to intervene in matters which are essentially within the domestic jurisdiction of any state.’ Today’s General Assembly resolution is therefore extralegal and transparently political.
“The UN was built on the principle of respect for the sovereignty of member states (known legally as complementarity), with full awareness that independent nations of the world must make policy decisions in the best interests of their domestic constituencies. The moment the institution begins to attack that very sovereignty is the moment the UN loses all credibility, authority and international deference.”
That's why the most significant part of Nikki Haley's response was where she said this:
“The United States will remember this day in which it was singled out for attack in the General Assembly for the very act of exercising our right as a sovereign nation. We will remember it when we are called upon to once again make the world’s largest contribution to the United Nations. And we will remember it when so many countries come calling on us, as they so often do, to pay even more and to use our influence for their benefit.”
For decades, the UN’s malicious double standard in repeatedly singling out Israel for condemnation has constituted the negation of its foundational ideals of global justice and peace. The UN has become instead the world’s principal engine of institutionalised Jew-hatred. Now it has crossed another line altogether. The Jerusalem vote could just be the point at which a US President finally decides that America’s tolerance towards the malign global incubus that the UN has become is now at an end.
In an interview with the DefesaNet, Mikael Franzén, Director and head of Business Unit Gripen Brazil, Saab, throws light on the Gripen E/F (called NG in Brazil) project so far and plans for the future.
“The most important thing that took place this year was the first flight of the 39-8 aircraft, and it was very successful,” he says.
Other than that, the next big achievement was the development of the Wide Area Display (WAD) for Gripen. WAD will have the basic display software by AEL and tactical software by Saab.
About the next main goals, he says that Saab is currently building a test aircraft which is aimed to fly in 2019. "In the meantime, we will perform tests on various subsystems for the fighter jets – it’s going to be a very intense test period. We are also going to work on furthering the development of the tactical system for the aircraft, as well as the two-seat version, which is still in its early stages," he says.
Since the first test flight, Saab has been in a phase of experimenting with speed and altitudes, various external loads, etc.
Development work for the new cockpit is also set to start soon.
Read the full interview here.
For more than two years the Free Software Foundation Europe has worked on the issue of Radio Lockdown, introduced by a European directive which may hinder users from loading software onto their radio devices such as mobile phones, laptops and routers. We have informed the public and talked to decision makers to fix critical points of the directive. There is still much to do to protect freedom and IT security in our radio devices. Read about the latest proceedings and the next steps.
In 2014, the European Parliament passed the Radio Equipment Directive which, among other regulations, makes vendors of radio hardware responsible for preventing users from installing software which may alter the devices' radio parameters to break applicable radio regulations. While we share the desire to keep radio frequencies clean, the directive's approach will have negative implications for users' rights and Free Software, fair competition, innovation and the environment – mostly without equal benefits for security.

"[R]adio equipment [shall support] certain features in order to ensure that software can only be loaded into the radio equipment where the compliance of the combination of the radio equipment and software has been demonstrated." – Article 3(3)(i) of the Radio Equipment Directive 2014/53/EU
This concern is shared by more than 50 organisations and businesses which signed our Joint Statement against Radio Lockdown, a result of our ongoing exchange and cooperation with the Free Software community in Europe and beyond.
The Radio Equipment Directive was put into effect in June 2017, but the classes of devices affected by the controversial Article 3(3)(i), which causes the Radio Lockdown, have not yet been defined. This means the directive doesn't concern any existing hardware yet. The definition of which hardware devices are covered will be decided by the European Commission through a delegated act and is expected to be finished at the earliest by the end of 2018.

"The Commission shall be empowered to adopt delegated acts in accordance with Article 44 specifying which categories or classes of radio equipment are concerned by each of the requirements [...]" – Article 3(3), paragraph 2 of 2014/53/EU
However, that list is already being prepared in the Expert Group on Reconfigurable Radio Systems, a body of member state authorities, organisations, and individuals whose task is to assist the European Commission with drafting the delegated acts to activate Article 3(3)(i). The FSFE applied to become a member of this committee but was rejected. The concern that the members of the Expert Group do not sufficiently represent civil society and the broad range of software users has also been raised during a recent meeting in the European Parliament.
Nevertheless, we are working together with organisations and companies to protect user freedoms on radio devices and keep in touch with members of the expert group. For example, we have shared our expertise for case studies and impact assessments drafted by the group members. We are also looking forward to a public consultation phase to officially present our arguments and improvement suggestions and allow other entities to share their opinion.
All our activities aim to protect Free Software and user rights on current and future radio devices. This is more important than ever since only a few members of the expert group seem to understand the importance of loading software on radio devices for IT security, for example critical updates on hardware which is not or only sporadically maintained by the original vendor. We will continue our efforts to make decision makers understand that Free Software (a.k.a. Open Source Software) is crucial for network security, science, education, and technical innovation. Therefore, broad exceptions in the class definition are necessary.
Conducting such lengthy policy activities requires a lot of resources for non-profit organisations like the FSFE. Please consider helping us by joining as an individual supporter today or a corporate donor to enable our work.
The Dutch Ministry of the Interior and Kingdom Relations released the source code and documentation of Basisregistratie Personen (BRP), a 100M€ IT system that registers information about inhabitants within the Netherlands. This comes as a great success for Public Code, and the FSFE applauds the Dutch government's shift to Free Software.
Operation BRP is an IT project by the Dutch government that has been in the works since 2004. It has cost Dutch taxpayers upwards of 100 million Euros and has endured three failed attempts at revival, without anything to show for it. From the outside, it was unclear what exactly was costing taxpayers so much money with very little information to go on. After the plug had been pulled from the project earlier this year in July, the former interior minister agreed to publish the source code under pressure of Parliament, to offer transparency about the failed project. Secretary of state Knops has now gone beyond that promise and released the source code as Free Software (a.k.a. Open Source Software) to the public.
In 2013, when the first smoke signals showed, the former interior minister initially wanted to address concerns about the project by providing limited parts of the source code to a limited amount of people under certain restrictive conditions. The ministry has since made a complete about-face, releasing a snapshot of the (allegedly) full source code and documentation under the terms of the GNU Affero General Public License, with the development history soon to follow.
In a letter to Dutch municipalities earlier in November, secretary of state Knops said that he is convinced of the need of an even playing field for all parties, and that he intends to "let the publication happen under open source terms". He went on to say: "What has been realised in operation BRP has namely been financed with public funds. Software that is built on top of this source code should in turn be available to the public again."
These statements are an echo of the Free Software Foundation Europe's Public Money, Public Code campaign, in which we implore public administrations to release software funded by the public as Free Software available to the citizenry that paid for it.
The echoes of 'Public Money, Public Code' do not stop there. In a letter to the Dutch parliament Wednesday 29 November, the secretary of state writes about the AGPL: "The license terms assure that changes to the source code are also made publicly available. In this way, reuse is further supported. The AGPL offers the best guarantee for this, and besides the GPL (General Public License), sees a lot of use and support in the open source community.
"Publication will happen free of charge so that, in the public interest, an even playing field is created for everyone who wants to reuse this code."
This is big news from the Netherlands and an unprecedented move of transparency by the Dutch government. Following a report to the Ministry of the Interior about publishing government software as Free Software (Open Source Software), it seems that this will happen more often. In it, Free Software is described as making the government more transparent, lowering costs, increasing innovation, forming the foundation for a digital participation society, and increasing the quality of code.
"We applaud the Dutch government for releasing the source code for BRP. We have been asking for this method of working since 2001, and it is good to see that the government is finally taking steps towards Free Software. In the future, we hope that the source code will be released during an earlier stage of development, which we believe in this case would have brought issues to light sooner", says Maurice Verheesen, coordinator of FSFE Netherlands.
If you like our campaign "Public Money, Public Code", please become a supporter today to enable our work!
I read a very interesting post today, “Code Quality Comparison of Firebird, MySQL, and PostgreSQL”, about static analysis of three open-source RDBMSs. And I wonder: should we use static code analyzers on an ongoing basis, e.g. PVS-Studio?
So, the code-quality rankings are as follows:
- 1st place: Firebird and PostgreSQL.
- 2nd place: MySQL.
Please remember that any review or comparison, including this one, is subjective. Different approaches may produce different results (although the conclusions seem stable for Firebird and PostgreSQL, less so for MySQL).
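The question of whether to run a static analyzer on an ongoing basis is easier to weigh with a concrete picture of what such a tool does. Below is a minimal sketch, not one of the analyzers from the comparison: a toy checker built on Python's standard `ast` module that flags `==`/`!=` comparisons against `None`, the kind of small defect real analyzers such as PVS-Studio report among thousands of other diagnostics.

```python
import ast

def find_none_comparisons(source: str) -> list[int]:
    """Toy static check: return line numbers where code compares
    against None with == or != instead of `is` / `is not`."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Compare):
            for op, comparator in zip(node.ops, node.comparators):
                if (isinstance(op, (ast.Eq, ast.NotEq))
                        and isinstance(comparator, ast.Constant)
                        and comparator.value is None):
                    findings.append(node.lineno)
    return findings

# Hypothetical code under analysis:
sample = """\
def lookup(table, key):
    row = table.get(key)
    if row == None:          # should be: row is None
        return None
    return row
"""

print(find_none_comparisons(sample))  # → [3]
```

Real analyzers run such checks over the whole codebase on every commit, which is why running them continuously, rather than as a one-off comparison, tends to pay off.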
The surprise publishing bestseller of the autumn is the slim volume The Secret Life of Cows. The author, Rosamund Young, chronicles the life of Stephanie, Ivor, Olivia, Alice, Jake and the rest of the herd on her Worcestershire organic farm. She regards every bovine as an individual with a distinctive character and a full range of emotions and experiences. Their lives, she writes, are as full and varied as our own.
The secret life of cows may be richer than we realise. The secret life of humans, however, is more brutish than we care to admit. Ascribing human characteristics to animals won’t mean treating them more kindly. It means treating human beings rather worse.
To read my whole Times column (£), please click here.
On 6 October, 32 European Ministers in charge of eGovernment policy signed the Tallinn Declaration on eGovernment that calls for more collaboration, interoperable solutions, and sharing of good practices throughout public administrations and across borders. Amongst other things, the EU ministers recognised the need to make more use of Free Software solutions and Open Standards when (re)building governmental digital systems with EU funds.
The Tallinn Declaration, led by the Estonian EU presidency, was adopted on 6 October 2017. It is a ministerial declaration that marks a new political commitment at European Union (EU) and European Free Trade Association (EFTA) level on priorities to ensure user-centric, cross-border digital public services for both citizens and businesses. While having no legislative power, the ministerial declaration marks a political commitment to ensure the digital transformation of public administrations through a set of commonly agreed principles and actions.
The FSFE has previously submitted its input for the aforementioned declaration during the public consultation round, asking for greater inclusion of Free Software in delivering truly inclusive, trustworthy and interoperable digital services to all citizens and businesses across the EU.
The adopted Tallinn Declaration proves to be a forward-looking document that acknowledges the importance of Free Software in order to ensure the principle of 'interoperability by default', and expresses the will of all signatory EU countries to:
"make more use of open source solutions and/or open standards when (re)building ICT systems and solutions (among else, to avoid vendor lock-ins)[...]"
Additionally, the signatories call upon the European Commission to:
"consider strengthening the requirements for use of open source solutions and standards when (re)building of ICT systems and solutions takes place with EU funding, including by an appropriate open licence policy – by 2020."
The last point is especially noteworthy, as it explicitly calls on the European Commission to use Free Software and Open Standards when building ICT infrastructure with EU funds. This is in line with our "Public Money, Public Code" campaign, which demands that all publicly financed software developed for the public sector be made publicly available under Free Software licences.

What's next?
The Tallinn Declaration sets several deadlines for its implementation over the next few years, including an annual presentation on the progress of implementation in the respective countries across the EU and EFTA through the eGovernment Action Plan Steering Board. The signatories also called upon the Austrian Presidency of the Council of the EU to evaluate the implementation of the Tallinn Declaration in autumn 2018.
"The Declaration expresses the political will of the EU and EFTA countries to digitise their governments in the most user-friendly and efficient way. The fact that it explicitly recognises the role of Free Software and Open Standards for a trustworthy, transparent and open eGovernment on a high level, along with a demand for strengthened reuse of ICT solutions based on Free Software in the EU public sector, is a valuable step forward to establishing a "Public Money, Public Code" reality across Europe", says Polina Malaja, the FSFE's policy analyst.
Whatever you think about Israel’s PM Benjamin Netanyahu, this is a master class in how to present the case, not just for Israel but for rational western policy on the manifold and gathering threats to the world within the Middle East – of which the overwhelming threat by far is posed by Iran. Watch how he calmly copes with the usual boiler-plate prejudices about the Palestinians and then makes the points that so badly need to be made to a British audience – such as pointing out that Israel has helped save many British lives. Not a fact that the British often hear.
I have attended 323 Postgres events in my career. While I have enjoyed almost all of them, many had different focuses, so I thought I would share my experiences. First, there are a variety of conference types:
- Vendor conferences: often in big cities, which focus on company-produced products
- Business conferences: also often in big cities, which focus on business challenges and discussions, often with high attendance prices
- Community conferences: led by people who care about open-source software and focus on software knowledge transfer
- Hobbyist conferences: often in smaller cities, which focus on interpersonal relationship building with technology as a catalyst, often free
It would be nice if I could say which conference types are good or bad, but that isn't possible. Each conference targets an audience whose needs it seeks to fulfill. Let's look at the needs that each fulfills:
- Vendor conferences: If you are new to a technology and need people to help you navigate purchase options, these conferences are for you.
- Business conferences: If you are frequently challenged to make business decisions, but feel you have no one to share options with or brainstorm, this type of conference can give you a framework to help you make your next complex business decision.
- Community conferences: If you spend significant time solving technological problems, you can gain great insight and new approaches by attending this type of conference.
- Hobbyist conferences: If you are looking for emotional connections to people who share similar interests, this type of conference can be personally rewarding.
Ideally everyone would go to conferences which match their interests, but what happens when they don't match? Here are some examples:
- Vendor conferences: "Wow, this is boring. The booth staff don't even know about the technology they are selling. When will this be over?"
- Business conferences: "People are very passionate about the problems they are trying to solve. I am glad I don't have these problems — they seem unsolvable."
- Community conferences: "These people really care about the minutiae of the software. When are they going to get a life?"
- Hobbyist conferences: "Does this end with everyone sitting in a circle and roasting marshmallows over a CPU fan?"