An artificial neural network is an ensemble of neurons connected by weighted synaptic connections, enabling superior computing performance over classical von Neumann-based systems in processing cognitive and data-intensive tasks such as real-time image recognition, data classification or natural language processing, to cite a few. Typically, the information carried by each synapse is represented by a weight and is transmitted from the pre-synaptic neuron to the post-synaptic neuron. The network is then trained by updating its synaptic weights to perform a specific task. In the race for efficient materials to emulate neuronal and synaptic functions, classical ferroelectrics (oxides and PVDF-based polymers) are good candidates. Here, we take advantage of the polar instabilities and the flat energy landscape characteristic of relaxors to exploit this special class of ferroelectrics for mimicking neuromorphic elements. We show that field-induced transitions can be used to tune the capacitance and polar responses over multiple states in a non-volatile manner, reproducing memristor and memcapacitor behaviors. We use such a component to emulate fundamental learning rules, including short-term and long-term memory and spike-timing- and spike-rate-dependent plasticity. These findings may open a new field of research dedicated to employing relaxors in the design of neuromorphic architectures and computing.
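For readers less familiar with the learning rules mentioned above, the following is a minimal sketch of a pair-based spike-timing-dependent plasticity (STDP) weight update; the function name, parameter values, and time constants are illustrative assumptions, not quantities taken from this work.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    dt > 0: post-synaptic spike follows pre-synaptic spike -> potentiation.
    dt < 0: pre-synaptic spike follows post-synaptic spike -> depression.
    Parameter values are illustrative assumptions only.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

# Example: a post-synaptic spike 5 ms after a pre-synaptic spike
# strengthens the synapse (potentiation).
w = 0.5
w += stdp_delta_w(t_pre=0.0, t_post=5.0)
```

In a relaxor-based implementation, such a weight would correspond to a non-volatile capacitance or polarization state tuned by the applied field pulses, rather than a software variable.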