Perceptron Problem in Neural Network

Title: Perceptron Problem in Neural Network

Academic Paper, 2010, 11 pages

Author: Dang Xuan Tho (Author)

Computer Science - Theoretical Computer Science
Extrait & Résumé des informations   Lire l'ebook
Résumé Extrait Résumé des informations

Academic paper on the topic: Perceptron Problem in Neural Network. Computer Science - Theoretical Computer Science. Written in English.

Excerpt


Table of Contents

1. Perceptron Problem

Objectives and Topics

This document investigates the behavior and adjustment mechanisms of a perceptron neural network model, specifically focusing on the dynamics of weight vector updates in response to two-dimensional input data distributions.

  • Mathematical modeling of perceptron input/output relations.
  • Visualization of input data distribution and weight vector orientation.
  • Analysis of weight update rules under varying conditions.
  • Determination of optimal regions for connection weight vectors.
  • Influence of the learning rate parameter on convergence speed and accuracy.

Excerpt from the Book

Perceptron Problem

Consider the perceptron shown in Fig. 9.1. The input data $x = [x_1, x_2]^\top$ and the output $y$ are related by

$$v = w_1 x_1 + w_2 x_2 = w^\top x \tag{9.1}$$

$$y = \psi(v) = \begin{cases} 1, & v \ge 0 \\ 0, & v < 0 \end{cases} \tag{9.2}$$

The input data $x \in C_1$ and $x \in C_2$ are distributed over 60° to 120° and 225° to 340°, respectively, as shown in Fig. 9.2. The data $x$ are randomly sampled from both classes and applied to the perceptron. The connection weights are updated according to

$$d(n) = \begin{cases} 1, & x \in C_1 \\ 0, & x \in C_2 \end{cases} \tag{9.3}$$

$$e(n) = d(n) - y(n) \tag{9.4}$$

$$w(0) = [1, 0]^\top \tag{9.5}$$

$$w(n+1) = w(n) + \eta\, e(n)\, x(n) \tag{9.6}$$

$$\lVert w(n) \rVert = 1 \tag{9.7}$$

$$w(n+1) = \frac{w(n+1)}{\lVert w(n+1) \rVert} \tag{9.8}$$

The inner product of $w$ and $x$ and the angle $\theta$ between them, shown in Fig. 9.3, are related by

$$\cos\theta = \frac{w^\top x}{\lVert w \rVert\, \lVert x \rVert} \tag{9.9}$$

Show the regions where the final weight vector $w$ can lie in the following two cases, indicating each region by an angle: a. $\eta$ is large; b. $\eta$ is very small. Then answer the following questions: 1. Find the region where the optimal connection weight vector can lie. 2. Determine the direction of adjustment.
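
The update loop of Eqs. (9.2) to (9.8) can be written out directly. The following is a minimal sketch, not the book's code: the sampling scheme (uniform angles, equal class probability), the step count, and the value of $\eta$ are our assumptions for illustration.

```python
# Minimal simulation of the normalized perceptron update, Eqs. (9.2)-(9.8),
# for the two angular classes C1 (60-120 deg) and C2 (225-340 deg).
import numpy as np

rng = np.random.default_rng(0)

def sample_point():
    """Draw a unit-length input x from C1 or C2 with equal probability."""
    if rng.random() < 0.5:
        angle, d = rng.uniform(60, 120), 1     # d(n) = 1 for x in C1  (9.3)
    else:
        angle, d = rng.uniform(225, 340), 0    # d(n) = 0 for x in C2
    a = np.deg2rad(angle)
    return np.array([np.cos(a), np.sin(a)]), d

def train(eta, steps=200):
    w = np.array([1.0, 0.0])                   # w(0) = [1, 0]^T       (9.5)
    for _ in range(steps):
        x, d = sample_point()
        y = 1 if w @ x >= 0 else 0             # y = psi(w^T x)        (9.2)
        e = d - y                              # e(n) = d(n) - y(n)    (9.4)
        w = w + eta * e * x                    # weight update         (9.6)
        w = w / np.linalg.norm(w)              # keep ||w|| = 1  (9.7)-(9.8)
    return w

w = train(eta=0.1)
print("final weight angle (deg):", np.degrees(np.arctan2(w[1], w[0])))
```

With the classes as given, the final angle should land in the feasible band discussed below (roughly 70° to 135°).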

Summary of Chapters

Perceptron Problem: This chapter introduces the basic mathematical framework of a perceptron, detailing how input potentials and weight vectors interact to determine output classifications within specific angular data distributions.

Keywords

Perceptron, Neural Network, Connection Weights, Weight Vector, Input Potential, Learning Rate, Vector Adjustment, Data Distribution, Convergence, Optimization, Inner Product, Classification, Feature Space, Vector Rotation, Adaptive Systems.

Frequently Asked Questions

What is the primary focus of this work?

The work provides a technical analysis of perceptron learning dynamics, specifically examining how connection weights are adjusted when classifying input data distributed in two-dimensional space.

What are the core thematic areas?

The core themes include mathematical modeling of neural nodes, geometric interpretation of weight updates, convergence behavior of perceptrons, and the optimization of weight vectors.

What is the central research question?

The text aims to determine the specific regions where an optimal connection weight vector can reside and how the perceptron's internal parameters influence the direction and magnitude of weight adjustments.

Which scientific method is employed?

The study utilizes analytical modeling and geometric derivation, substituting equations representing input potentials and classification rules to track the trajectory of the weight vector.

What is addressed in the main part?

The main part covers the systematic derivation of update rules (Eq. 9.1 through 9.39) and maps these to specific geometric rotations (clockwise/counter-clockwise) based on the distribution of input data classes.
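
The clockwise or counter-clockwise behavior follows directly from the update rule (9.6); the following reading of the geometry is a reconstruction, not a quotation from the book:

$$w(n+1) = w(n) + \eta\, e(n)\, x(n), \qquad e(n) \in \{-1, 0, +1\}$$

When $e(n) = +1$ (a point of $C_1$ was classified as 0), $w$ gains a component along $x$ and rotates toward the $C_1$ sector; when $e(n) = -1$ (a point of $C_2$ was classified as 1), $w$ loses a component along $x$ and rotates away from the $C_2$ sector. Whether that rotation is clockwise or counter-clockwise in the plane depends on which side of $w$ the misclassified $x$ lies.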

What characterizes the work?

The work is characterized by its focus on the mathematical foundations of the Perceptron, using clear step-by-step algebraic substitutions and figure-based visual representations.

How does the learning rate η affect the model?

A large learning rate allows for quick movement towards the optimal weight vector but may prevent exact stopping at the target destination, whereas a very small learning rate ensures a slower, more precise approach.
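
To make this trade-off concrete, the sketch below (ours, not the book's) computes how far a single corrective update, Eqs. (9.6) to (9.8), rotates a unit-norm $w$ for several values of $\eta$; the initial 80° gap between $w$ and $x$ is an arbitrary assumption.

```python
# Rotation produced by one e(n) = +1 update w' = w + eta*x, with w and x
# kept at unit norm as in Eqs. (9.7)-(9.8). Illustrative sketch only.
import numpy as np

def rotation_deg(eta, gap_deg=80.0):
    """Degrees by which one corrective update rotates w toward x,
    where gap_deg is the initial angle between w and x."""
    w = np.array([1.0, 0.0])
    g = np.deg2rad(gap_deg)
    x = np.array([np.cos(g), np.sin(g)])       # unit-norm input
    w_new = w + eta * x                        # Eq. (9.6) with e(n) = +1
    w_new /= np.linalg.norm(w_new)             # renormalize, Eqs. (9.7)-(9.8)
    return np.degrees(np.arccos(np.clip(w @ w_new, -1.0, 1.0)))

for eta in (2.0, 0.5, 0.01):
    print(f"eta={eta:>4}: w rotates ~{rotation_deg(eta):.2f} deg per update")
```

A large $\eta$ swings $w$ by tens of degrees per correction, so it can repeatedly jump across the optimal band; a very small $\eta$ moves $w$ a fraction of a degree at a time and can settle inside it.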

Why is the final region restricted to 70 to 135 degrees?

The region is bounded by the constraints of the classification conditions (Eq. 9.40a and 9.40b), ensuring that the inner product with input data correctly maps to the required output for classes C1 and C2.
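
Equations 9.40a and 9.40b are not reproduced in this excerpt; assuming they state the classification conditions $w^\top x \ge 0$ for $x \in C_1$ and $w^\top x < 0$ for $x \in C_2$, the bound can be checked geometrically. Writing $\varphi_w$ and $\varphi_x$ for the angles of $w$ and $x$ in the plane, Eq. (9.9) gives

$$w^\top x \ge 0 \iff |\varphi_w - \varphi_x| \le 90^\circ$$

For every $x \in C_1$ ($\varphi_x \in [60^\circ, 120^\circ]$) this requires $\varphi_w \in [120^\circ - 90^\circ,\ 60^\circ + 90^\circ] = [30^\circ, 150^\circ]$; for every $x \in C_2$ ($\varphi_x \in [225^\circ, 340^\circ]$) it must fail, which excludes $[225^\circ - 90^\circ,\ 340^\circ + 90^\circ] = [135^\circ, 70^\circ]$ (mod $360^\circ$). The intersection of the two constraints is $(70^\circ, 135^\circ)$, matching the stated region.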

End of excerpt from 11 pages

Summary of Information

Title
Perceptron Problem in Neural Network
Author
Dang Xuan Tho (Author)
Year of publication
2010
Pages
11
Catalog number
V153037
ISBN (ebook)
9783640648870
ISBN (book)
9783640648955
Language
English
Keywords
Perceptron Problem Neural Network
Product safety
GRIN Publishing GmbH
Citation
Dang Xuan Tho (Author), 2010, Perceptron Problem in Neural Network, Munich, GRIN Verlag, https://www.grin.com/document/153037