Tuesday, September 4, 2012

Conspiracy Against Nigeria and the Threat of Boko Haram

Prof. Yahaya Speaks on the Phenomenon of Boko Haram in Nigeria

Professor Dahiru Yahaya of the Department of History, Bayero University Kano, spoke on the phenomenon of Boko Haram in Nigeria on Sunday, 16 October 2011, during the 7th National Conference of the Resource Forum of the Islamic Movement in Nigeria at Baqiyatullah Husainiyyah, Zaria.
  
The Professor spoke on the topic ‘Phenomenon of Boko Haram in Nigeria’, in which he addressed two issues:
1. Conspiracy against Nigeria
2. Boko Haram
Conspiracy against Nigeria, he said, is a deliberate design against an individual or entity to bring it down for the benefit of the perpetrators, and it began in the colonial era, when the British sent explorers to find ways to exploit resources after the Jihad of Dan Fodio.

When the colonialists came to Nigeria, the first conspiracy was to convert Muslims to Christianity through the use of missionaries like Miller, who was trained and taught the Hausa language. They built churches everywhere with taxes collected in the Muslim lands. However, they realized that Muslims could not be converted to Christianity and that only non-Muslims accepted the new faith.
Since the Muslims would not accept Christianity, the colonialists later opted for a second conspiracy: to separate Muslims from Islam. Thus, they wanted to produce a secular Muslim bereft of Islamic values, the Qur’an and the Holy Prophet (AS). They achieved this by providing three types of education in Nigeria: an education that teaches the love of Christianity was given to the South, especially the Igbo; the Yoruba land was given Western education; while the North was given a secular one.
So a secularist can work against the interests of Islam and Muslims, and this is the type of person produced in the North: one who loves non-Muslims more than his fellow Muslims.
The Professor explained further that there is another class of Muslims who received their education through traditional methods but do not understand their world.
Another conspiracy of the colonial masters was to produce yet another class of people, who are taught to hate Islam; the hopes of this class are fading as their backbone, the source of their support, weakens.
In all these cases the motive of the conspiracy is domination: to dominate minds and resources. However, the trend cannot continue, as everyone is now aware of his resources and how to control them, the Professor noted.
Boko Haram
Resisting oppression is established in Islam by the Ahlul Bayt (AS), particularly by Imam Husain (AS). Mental or intellectual hijrah is also an established matter in Islam. We are in a situation in Nigeria where Halal and Haram have no dividing line. Thus, methods of checking oppression have been employed by different scholars at different times, and the methods differ from one era to another. Boko Haram, however, subscribed to Islam without subscribing to time and change.
For example, Sheikh Bn Fodio gave a charter of Islamic governance in which he addressed issues regarding the life of the individual. This charter is now used by UN bodies. Its provisions include: 1. protection of the life of the individual; 2. human rights and personal dignity, which include the right to exist, the right to own property, the right of assembly, and so on. However, in Nigeria even the life of the citizen is not protected. Nigerians are pushed to the wall; only the poor are taxed while the rich are not. Everybody knows that things are not right in Nigeria.
Now, the methods of checking these excesses vary from group to group. While some prefer to die rather than live in such conditions, others subscribe to other methods. This is the beginning of Boko Haram, and many groups are coming up, the Professor explained.
Were it not for the Islamic Movement in Nigeria, anybody could become Boko Haram; anybody whose life has no meaning in Nigeria can become Boko Haram. When people have no meaning in their lives, they become criminals.
He concluded by saying that Boko Haram is the result of bad governance, under which people have no meaning in their lives, only frustration.

Monday, April 23, 2012

Islam and Depression

Have You Ever Been SAD? What Is the Cause of Your Sadness? Must You Even Be Sad?

Find more details in the following MESSAGE
“What does Islam say regarding self-hatred and self-harm? Does Islam condemn situational depression? What about clinical depression? What about depression over our limited human knowledge -our inability to fully understand everything- is it a trust issue with Allah?”

Actually this is a very interesting issue, because according to psychological studies, a considerable percentage of people alive today are subject to some kind of depression, even small children (http://www.depression-guide.com/depression-statistics.htm ), so it is important to explore this issue in relation to being better Muslims.

The Islamic system aims to create balance in a Muslim’s life, by putting life matters into perspective, rearranging priorities accordingly, and harmonizing all circles of relationships between the individual and his inner and outer environments:

“Seek the life to come by means of what God granted you, but do not neglect your rightful share in this world. Do good to others as God has done good to you. Do not seek to spread corruption in the land, for God does not love those who do this” (Quran, 28:77)

People feel depressed or sad when this harmonious emotional and hormonal equilibrium is disturbed, in which case Islam steps in, not to condemn the feeling, but to offer a solution for regaining psychological and mental balance.

What is Depression?
There is a difference between situational depression (temporary deep distress or sadness) and clinical depression, which is a mental health disorder that can affect the way you work, study, sleep, eat, and enjoy pleasurable activities (http://mentalhealth.about.com/od/depression/a/depression1.htm). A depressive disorder is more than a passing mood. It is not a sign of personal weakness, and it cannot be willed or wished away, because it is a change in the chemistry of the brain (neurochemistry) that triggers a certain mood, and it needs professional help for treatment.

What causes Depression?
The causes of depression are numerous: genetic, psychological, and environmental factors are often involved. Yet the relation between the chemistry of your brain and your experience of life is a two-way street: it is true that your brain affects how you handle life situations, but the way you solve your problems and handle challenges also greatly affects the mood chemistry of your brain. So people who have low self-esteem, who are consistently pessimistic, who are readily overwhelmed by stress, or who have a severe physical illness are prone to depression.

Can a Muslim be Depressed?
To become Muslim, you submit your will to God alone and no one else, and you believe and trust that He will take good care of you, no matter what happens, as long as you keep your side of the relationship with Him. You admit your limitations as a human, so you go through life looking ahead positively, worrying only about what’s in your knowledge and ability as a human, and you leave the rest to God’s wisdom.

Existential concerns can cause serious distress as one tries to understand: why am I here, where am I going, what’s the point of living if I’m going to die anyway? As a Muslim, you get affected by life’s troubles and disturbing thoughts like everyone else, but you’re well equipped to deal with them because you have a clear roadmap of where you came from, where you’re going and why, so you have a head-start having this fundamental knowledge from its source. In other words, you’re resistant to existential emptiness, your focus is on taking control of your life to make the most of it according to the purpose it was given to you for, and you make decisions that won't cause you to feel worse in bad times.

Someone who feels completely lost and alone in the face of a crisis would be hopeless, helpless and depressed, but someone who constantly feels supported by a compassionate God who genuinely cares, who listens to desperate pleas, and who grants generous help, has a better chance of getting back on track much faster because there is a strong helping hand to reach for while dealing with life’s troubles.

Depression is not condemned in Islam:
Islam doesn’t require us to be superhuman. If one experiences negative feelings, he is encouraged to resist them with positive thoughts and actions if possible, or to seek professional help if the case is clinical, exactly like any other form of illness.

We’re required to take charge of our lives since we’re accountable for our deeds and decisions, both for ourselves and for others who will be affected. We’re not allowed to hate or harm ourselves; instead we’re taught dignity, self-respect and self-protection, both as a right and a duty:

“And make not your own hands contribute to your destruction; but do good; for Allah loves those who do good.” (Quran, 2:195)

“Nor kill or destroy yourselves: for verily Allah has been to you Most Merciful!” (Quran, 4:29)

Self-hatred results from low self-esteem in reaction to feelings of worthlessness, hopelessness, or guilt. A Muslim feels dignified and honored because the Creator has bestowed upon him special privileges:

“We have honored the children of Adam and carried them by land and sea. We have provided good sustenance for them and favored them specially above many of those We have created” (Quran, 17:70)

And even if you’ve committed the worst sins, you always have hope of God’s mercy:

“And never give up hope of Allah's soothing Mercy: truly no one despairs of Allah's soothing Mercy, except those who have no faith.” (Quran, 12:87)

There is no place for despair because you have confidence in knowing that it’s God Himself who is in charge of everything, the All Seeing, All Knowing, and All Fair and Wise God:

“And for those who fear Allah, He always prepares a way out, and He provides for him from sources he never could imagine. And if anyone puts his trust in Allah, sufficient is Allah for him. For Allah will surely accomplish His purpose: verily, for all things has Allah appointed a due proportion.” (Quran, 65: 2-3)

You’re certain there is no impossible situation which has no solution:

“So, verily, with every difficulty, there is relief: Verily, with every difficulty there is relief.” (Quran, 94: 5-6)

You also have a simple and effective prescription against transient grief and anxiety:

(O Allah, I am Your slave, son of Your slave, son of Your female slave, my forelock is in Your hand, Your command over me is forever executed and Your decree over me is just. I ask You by every Name belonging to You which You named Yourself with, or revealed in Your Book, or You taught to any of Your creation, or You have preserved in the knowledge of the unseen with You, that You make the Qur’an the life of my heart and the light of my breast, and a departure for my sorrow and a release for my anxiety)

The Prophet Muhammad (peace be upon him) said: “No person suffers any anxiety or grief, and says (this supplication) but Allah will take away his sorrow and grief, and give him in their stead joy.”

Thursday, March 22, 2012

COMPUTER Cybernetics

 

CYBERNETICS — A Definition

Artificial Intelligence and cybernetics: Aren't they the same thing? Or, isn't one about computers and the other about robots? The answer to these questions is emphatically, No.
Researchers in Artificial Intelligence (AI) use computer technology to build intelligent machines; they consider implementation (that is, working examples) as the most important result. Practitioners of cybernetics use models of organizations, feedback, goals, and conversation to understand the capacity and limits of any system (technological, biological, or social); they consider powerful descriptions as the most important result.
The field of AI first flourished in the 1960s as the concept of universal computation [Minsky 1967], the cultural view of the brain as a computer, and the availability of digital computing machines came together to paint a future where computers were at least as smart as humans. The field of cybernetics came into being in the late 1940s when concepts of information, feedback, and regulation [Wiener 1948] were generalized from specific applications in engineering to systems in general, including systems of living organisms, abstract intelligent processes, and language.

Origins of "cybernetics"

The term itself began its rise to popularity in 1947 when Norbert Wiener used it to name a discipline apart from, but touching upon, such established disciplines as electrical engineering, mathematics, biology, neurophysiology, anthropology, and psychology. Wiener, Arturo Rosenblueth, and Julian Bigelow needed a name for their new discipline, and they adapted a Greek word meaning "the art of steering" to evoke the rich interaction of goals, predictions, actions, feedback, and response in systems of all kinds (the term "governor" derives from the same root) [Wiener 1948]. Early applications in the control of physical systems (aiming artillery, designing electrical circuits, and maneuvering simple robots) clarified the fundamental roles of these concepts in engineering; but the relevance to social systems and the softer sciences was also clear from the start. Many researchers from the 1940s through 1960 worked solidly within the tradition of cybernetics without necessarily using the term, some likely (R. Buckminster Fuller) but many less obviously (Gregory Bateson, Margaret Mead).
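
To make the feedback vocabulary concrete, here is a minimal sketch (not from the article) of the kind of goal-directed regulator that cybernetics generalizes from: a thermostat-style controller that compares a goal with an observation and acts on the error. The names (Thermostat, step, simulate) and the gain value are illustrative assumptions, not anything Wiener or his colleagues specified.

```python
# A minimal sketch of the feedback loop cybernetics generalizes:
# a goal, a measurement, an error, and a corrective action.
# All names and constants here are illustrative only.

class Thermostat:
    """Goal-directed regulator: steers a measured temperature toward a set point."""

    def __init__(self, set_point: float, gain: float = 0.5):
        self.set_point = set_point  # the goal
        self.gain = gain            # how strongly to respond to the error

    def step(self, measured: float) -> float:
        """Return a heating (positive) or cooling (negative) action from feedback."""
        error = self.set_point - measured   # compare goal with observation
        return self.gain * error            # proportional corrective response


def simulate(hours: int = 12, outside: float = 10.0) -> None:
    room, controller = 15.0, Thermostat(set_point=21.0)
    for hour in range(hours):
        action = controller.step(room)              # feedback drives the action
        room += action + 0.1 * (outside - room)     # environment responds, closing the loop
        print(f"hour {hour:2d}: room {room:5.2f} C, action {action:+.2f}")


if __name__ == "__main__":
    simulate()
```

The loop of goal, measurement, error, corrective action, changed environment and new measurement is the same whether the regulated system is a room, an artillery mount, or a social process, which is the generalization the early cyberneticians were after.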

Limits to knowing

In working to derive functional models common to all systems, early cybernetic researchers quickly realized that their "science of observed systems" cannot be divorced from "a science of observing systems" — because it is we who observe [von Foerster 1974]. The cybernetic approach is centrally concerned with this unavoidable limitation of what we can know: our own subjectivity. In this way cybernetics is aptly called "applied epistemology". At minimum, its utility is the production of useful descriptions, and, specifically, descriptions that include the observer in the description. The shift of interest in cybernetics from "observed systems" — physical systems such as thermostats or complex auto-pilots — to "observing systems" — language-oriented systems such as science or social systems — explicitly incorporates the observer into the description, while maintaining a foundation in feedback, goals, and information. It applies the cybernetic frame to the process of cybernetics itself. This shift is often characterized as a transition from 'first-order cybernetics' to 'second-order cybernetics'. Cybernetic descriptions of psychology, language, arts, performance, or intelligence (to name a few) may be quite different from more conventional, hard "scientific" views — although cybernetics can be rigorous too. Implementation may then follow in software and/or hardware, or in the design of social, managerial, and other classes of interpersonal systems.

Origins of AI in cybernetics

Ironically but logically, AI and cybernetics have each gone in and out of fashion and influence in the search for machine intelligence. Cybernetics started in advance of AI, but AI dominated between 1960 and 1985, when repeated failures to achieve its claim of building "intelligent machines" finally caught up with it. These difficulties in AI led to renewed search for solutions that mirror prior approaches of cybernetics. Warren McCulloch and Walter Pitts were the first to propose a synthesis of neurophysiology and logic that tied the capabilities of brains to the limits of Turing computability [McCulloch & Pitts 1965]. The euphoria that followed spawned the field of AI [Lettvin 1989] along with early work on computation in neural nets, or, as then called, perceptrons. However the fashion of symbolic computing rose to squelch perceptron research in the 1960s, followed by its resurgence in the late 1980s. However this is not to say that current fashion in neural nets is a return to where cybernetics has been. Much of the modern work in neural nets rests in the philosophical tradition of AI and not that of cybernetics.
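
For readers unfamiliar with the term, the following is a small, self-contained sketch of the classic perceptron learning rule, the sort of early neural-net computation the paragraph refers to. The task (learning logical AND), the learning rate, and all function names are chosen purely for illustration.

```python
# Illustrative sketch of the classic perceptron learning rule.
# The task (logical AND) and all names here are chosen for the example.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Learn weights and a bias so that sign(w.x + b) matches the labels."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            predicted = 1 if activation > 0 else 0
            error = target - predicted          # discrepancy drives the update
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b


if __name__ == "__main__":
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    targets = [0, 0, 0, 1]                      # logical AND
    w, b = train_perceptron(inputs, targets)
    for x, t in zip(inputs, targets):
        out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
        print(x, "->", out, "(expected", t, ")")
```

A single perceptron can only separate classes with a hyperplane, which is why it learns AND but not XOR; Minsky and Papert's analysis of such limits is often cited as contributing to the decline of perceptron research mentioned above.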

Philosophy of cybernetics

AI is predicated on the presumption that knowledge is a commodity that can be stored inside of a machine, and that the application of such stored knowledge to the real world constitutes intelligence [Minsky 1968]. Only within such a "realist" view of the world can, for example, semantic networks and rule-based expert systems appear to be a route to intelligent machines. Cybernetics in contrast has evolved from a "constructivist" view of the world [von Glasersfeld 1987] where objectivity derives from shared agreement about meaning, and where information (or intelligence for that matter) is an attribute of an interaction rather than a commodity stored in a computer [Winograd & Flores 1986]. These differences are not merely semantic in character, but rather determine fundamentally the source and direction of research performed from a cybernetic, versus an AI, stance.
The underlying philosophical differences between AI and cybernetics show up in how each field construes the same core terms. For example, the concept of "representation" is understood quite differently in the two fields. In AI's "realist" perspective the relations among these terms are causal arrows, reflecting the reductionist reasoning that via our nervous systems we discover the-world-as-it-is. In the "constructivist" perspective the relations are non-hierarchical and circular, reflecting a world that is invented (in contrast to being discovered) by an intelligence acting in a social tradition and creating shared meaning via hermeneutic (circular, self-defining) processes. The implications of these differences are very great and touch on recent efforts to reproduce the brain [Hawkins 2004, IBM/EPFL 2004], which maintain roots in the paradigm of "brain as computer". These approaches carry the same limitations as digital symbolic computing and are likely neither to explain nor to reproduce the functioning of the nervous system.

Influences

Winograd and Flores credit the influence of Humberto Maturana, a biologist who recasts the concepts of "language" and "living system" with a cybernetic eye [Maturana & Varela 1988], in shifting their opinions away from the AI perspective. They quote Maturana: "Learning is not a process of accumulation of representations of the environment; it is a continuous process of transformation of behavior through continuous change in the capacity of the nervous system to synthesize it. Recall does not depend on the indefinite retention of a structural invariant that represents an entity (an idea, image or symbol), but on the functional ability of the system to create, when certain recurrent demands are given, a behavior that satisfies the recurrent demands or that the observer would class as a reenacting of a previous one." [Maturana 1980] Cybernetics has directly affected software for intelligent training, knowledge representation, cognitive modeling, computer-supported coöperative work, and neural modeling. Useful results have been demonstrated in all these areas. Like AI, however, cybernetics has not produced recognizable solutions to the machine intelligence problem, at least not for domains considered complex in the metrics of symbolic processing. Many beguiling artifacts have been produced with an appeal more familiar in an entertainment medium or in organic life than in a piece of software [Pask 1971]. Meanwhile, in a repetition of the history of the 1950s, the influence of cybernetics is felt throughout the hard and soft sciences, as well as in AI. This time, however, it is cybernetics' epistemological stance — that all human knowing is constrained by our perceptions and our beliefs, and hence is subjective — that is its contribution to these fields. We must continue to wait to see whether cybernetics leads to breakthroughs in the construction of intelligent artifacts of the complexity of a nervous system, or a brain.

Cybernetics Today

The term "cybernetics" has been widely misunderstood, perhaps for two broad reasons. First, its identity and boundary are difficult to grasp. The nature of its concepts and the breadth of its applications, as described above, make it difficult for non-practitioners to form a clear concept of cybernetics. This holds even for professionals of all sorts, as cybernetics never became a popular discipline in its own right; rather, its concepts and viewpoints seeped into many other disciplines, from sociology and psychology to design methods and post-modern thought. Second, the advent of the prefix "cyb" or "cyber" as a referent to either robots ("cyborgs") or the Internet ("cyberspace") further diluted its meaning, to the point of serious confusion to everyone except the small number of cybernetic experts.
However, the concepts and origins of cybernetics have become of greater interest recently, especially since around the year 2000. Lack of success by AI to create intelligent machines has increased curiosity toward alternative views of what a brain does [Ashby 1960] and alternative views of the biology of cognition [Maturana 1970]. There is growing recognition of the value of a "science of subjectivity" that encompasses both objective and subjective interactions, including conversation [Pask 1976]. Designers are rediscovering the influence of cybernetics on the tradition of 20th-century design methods, and the need for rigorous models of goals, interaction, and system limitations for the successful development of complex products and services, such as those delivered via today's software networks. And, as in any social cycle, students of history reach back with minds more open than was possible at the inception of cybernetics, to reinterpret the meaning and contribution of a previous era.

Tuesday, February 21, 2012

Al-Hikmat College, Agege To Launch/Commission Her ICT Center

Globalization and technological change—processes that have accelerated in tandem over the past fifteen years—have created a new global economy "powered by technology, fueled by information and driven by knowledge." The emergence of this new global economy has serious implications for the nature and purpose of educational institutions. As the half-life of information continues to shrink and access to information continues to grow exponentially, schools cannot remain mere venues for the transmission of a prescribed set of information from teacher to student over a fixed period of time. Rather, schools must promote "learning to learn," i.e., the acquisition of knowledge and skills that make possible continuous learning over the lifetime. "The illiterate of the 21st century," according to futurist Alvin Toffler, "will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn."

Concerns over educational relevance and quality coexist with the imperative of expanding educational opportunities to those made most vulnerable by globalization—developing countries in general; low-income groups, girls and women, and low-skilled workers in particular. Global changes also put pressure on all groups to constantly acquire and apply new skills. The International Labour Organization defines the requirements for education and training in the new global economy simply as "Basic Education for All", "Core Work Skills for All" and "Lifelong Learning for All".

Information and communication technologies (ICTs)—which include radio and television, as well as newer digital technologies such as computers and the Internet—have been touted as potentially powerful enabling tools for educational change and reform. When used appropriately, different ICTs are said to help expand access to education, strengthen the relevance of education to the increasingly digital workplace, and raise educational quality by, among others, helping make teaching and learning into an engaging, active process connected to real life.

However, the experience of introducing different ICTs in the classroom and other educational settings all over the world over the past several decades suggests that the full realization of the potential educational benefits of ICTs is not automatic. The effective integration of ICTs into the educational system is a complex, multifaceted process that involves not just technology—indeed, given enough initial capital, getting the technology is the easiest part!—but also curriculum and pedagogy, institutional readiness, teacher competencies, and long-term financing, among others.

This primer is intended to help policymakers in developing countries define a framework for the appropriate and effective use of ICTs in their educational systems by first providing a brief overview of the potential benefits of ICT use in education and the ways by which different ICTs have been used in education thus far. Second, it addresses the four broad issues in the use of ICTs in education: effectiveness, cost, equity, and sustainability. The primer concludes with a discussion of five key challenges that policymakers in developing countries must reckon with when making decisions about the integration of ICTs in education, namely: educational policy and planning, infrastructure, capacity building, language and content, and financing.


Friday, February 17, 2012

ICT minister

Click the link below

http://webtrendsng.com/blog/breaking-news-nigeria-in-search-of-ict-minister-can-you-recommend-one/