Sunday, January 26, 2020

Experiment for Plant Recognition

Abstract: In classical sparse representation based classification (SRC) and weighted SRC (WSRC) algorithms, the test sample is sparsely represented by all training samples. They emphasize the sparsity of the coding coefficients but do not consider the local structure of the input data. Although more training samples yield a better sparse representation, finding a global sparse representation for the test sample is time consuming on a large-scale database. To overcome this shortcoming, and aiming at the difficult problem of plant leaf recognition on large-scale databases, a two-stage local similarity based classification learning (LSCL) method is proposed by combining the local mean-based classification (LMC) method and local WSRC (LWSRC). In the first stage, LMC is applied to coarsely classify the test sample: the k nearest neighbors of the test sample are selected from each training class as a neighbor subset, the local geometric center of each class is calculated, and S candidate neighbor subsets of the test sample are determined by the first S smallest distances between the test sample and each local geometric center. In the second stage, LWSRC is proposed to approximately represent the test sample through a linear weighted sum of all k×S samples of the S candidate neighbor subsets. The rationale of the proposed method is as follows: (1) the first stage aims to eliminate the training samples that are far from the test sample, on the assumption that these samples have no effect on the ultimate classification decision, and then selects the candidate neighbor subsets of the test sample; the classification problem thus becomes simpler, with fewer subsets; (2) the second stage pays more attention to those training samples of the candidate neighbor subsets when representing the test sample as a weighted sum, which helps to represent the test sample accurately.
Experimental results on the leaf image database demonstrate that the proposed method not only has high accuracy and low time cost, but can also be clearly interpreted.

Keywords: Local similarity-based classification learning (LSCL); Local mean-based classification method (LMC); Weighted sparse representation based classification (WSRC); Local WSRC (LWSRC); Two-stage LSCL.

1. Introduction

Similarity-based classification learning (SCL) methods make use of the pair-wise similarities or dissimilarities between a test sample and each training sample to design the classifier. K-nearest neighbor (K-NN) is a non-parametric, simple, attractive and relatively mature SCL method that is easy to implement quickly [1,2]. It has been widely applied in computer vision, pattern recognition and machine learning [3,4]. Its basic process is: calculate the distance (as dissimilarity or similarity) between the test sample y and each training sample, select the k samples with the k minimum distances as the k nearest neighbors of y, and finally assign y to the category to which most of its k nearest neighbors belong. In weighted K-NN, it is useful to weight the contributions of the neighbors, so that nearer neighbors contribute more to the classification than more dissimilar ones. One disadvantage of K-NN is that, when the distribution of the training set is uneven, K-NN may misjudge, because it only considers the order of the first k nearest neighbors and not the sample density. Moreover, the performance of K-NN is seriously affected by outliers and noise samples. To overcome these problems, a number of local SCL (LSCL) methods have been proposed recently. The local mean-based nonparametric classifier (LMC) is an improved K-NN that can resist noise and classify unbalanced data [5,6].
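The weighted K-NN rule described above can be sketched as follows. This is a minimal numpy illustration under our own naming; the inverse-distance weighting scheme is one common choice, not necessarily the one used in [1,2].

```python
import numpy as np

def weighted_knn(X_train, labels, y, k=3, eps=1e-8):
    """Weighted K-NN: the k nearest training samples vote for the label of
    test sample y, each vote weighted by inverse distance, so that nearer
    neighbors contribute more than more dissimilar ones.
    X_train: (n, m) training samples as rows; labels: length-n label array."""
    d = np.linalg.norm(X_train - y, axis=1)    # distance to every training sample
    votes = {}
    for i in np.argsort(d)[:k]:                # the k nearest neighbors of y
        votes[labels[i]] = votes.get(labels[i], 0.0) + 1.0 / (d[i] + eps)
    return max(votes, key=votes.get)           # label with the largest weighted vote
```

Plain K-NN is the special case where every vote has weight 1; the inverse-distance weights mitigate the uneven-density problem mentioned above, since a lone far neighbor contributes little.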
The main idea of LMC is to calculate the local mean-based vector of each class from the k nearest neighbors of the test sample, and to classify the test sample into the category of the nearest local mean-based vector. One disadvantage of LMC is that it cannot well represent the similarity between multidimensional vectors. To improve the performance of LMC, Mitani et al. [5] proposed a reliable local mean-based K-NN algorithm (LMKNN), which employs the local mean vector of each class to classify the test sample. LMKNN has already been successfully applied to group-based classification, discriminant analysis and distance metric learning. Zhang et al. [6] further improved LMC by utilizing the cosine distance instead of the Euclidean distance to select the k nearest neighbors, which is proved to be better suited to the classification of multidimensional data. The above SCL, LMC and LSCL algorithms are often not effective when the data patterns of different classes overlap in regions of feature space. Recently, sparse representation based classification (SRC) [8], a modified SCL manner, has attracted much attention in various areas. It can achieve better classification performance than other typical clustering and classification methods such as SCL, LSCL, linear discriminant analysis (LDA) and principal component analysis (PCA) [7] in some cases. In SRC [9], a test image is encoded over the original training set with a sparse constraint imposed on the encoding vector. The training set acts as a dictionary to linearly represent the test samples. SRC emphasizes the sparsity of the coding coefficients but does not consider the local structure of the input data [10,11]. However, the local structure of the data is proven to be important for classification tasks. To make use of the local structure of the data, some weighted SRC (WSRC) and local SRC (LSRC) algorithms have been proposed. Guo et al.
[12] proposed a similarity WSRC algorithm, in which the similarity matrix between the test samples and the training samples can be constructed by various distance or similarity measurements. Lu et al. [13] proposed a WSRC algorithm to represent the test sample by exploiting the weighted training samples based on the l1-norm. Li et al. [14] proposed an LSRC algorithm to perform the sparse decomposition in a local neighborhood. In LSRC, instead of solving the l1-norm constrained least square problem for all training samples, they solved a similar problem in the local neighborhood of each test sample. SRC, WSRC, similarity WSRC and LSRC have several things in common: the individual sparsity and the local similarity between the test sample and the training samples are considered, to ensure that neighboring coding vectors are similar to each other if they have strong correlation; the weighted matrix is constructed by incorporating the similarity information; the similarity weighted l1-norm minimization problem is constructed and solved; and the obtained coding coefficients tend to be local and robust. Leaf based plant species recognition is one of the most important branches of pattern recognition and artificial intelligence [15-18]. It is useful for agricultural producers, botanists, industrialists, food engineers and physicians, but it is an NP-hard problem and a challenging research topic [19-21]: plant leaves are quite irregular, it is difficult to accurately describe their shapes compared with industrial work pieces, and some between-species leaves differ from each other, as shown in Fig.1A and B, while within-species leaves are similar to each other, as shown in Fig.1C [22].

Fig.1 Plant leaf examples: (A) four different species leaves; (B) four different species leaves; (C) ten same-species leaves (a test sample and training samples 1-7 are shown).

SRC can be applied to leaf based plant species recognition [23,24].
In theory, in SRC and modified SRC, it is fine to sparsely represent the test sample by many training samples. In practice, however, it is time consuming to find a global sparse representation on a large-scale leaf image database, because leaf images are more complex than face images. To overcome this problem, in this paper, motivated by the recent progress and success in LMC [6], modified SRC [12-14], two-stage SR [25] and SR based coarse-to-fine face recognition [26], a novel plant recognition method is proposed by integrating LMC and WSRC into leaf classification, and verified on a large-scale dataset. Different from the classical plant classification methods and the modified SRC algorithms, the proposed method implements plant species recognition through a coarse recognition process followed by a fine recognition process. The major contributions of the proposed method are: (1) a two-stage plant species recognition method is proposed for the first time; (2) a local WSRC algorithm is proposed to sparsely represent the test sample; (3) the experimental results indicate that the proposed method is very competitive for plant species recognition on a large-scale database. The remainder of this paper is arranged as follows: Section 2 briefly reviews LMC, SRC and WSRC. Section 3 describes the proposed method and provides some rationale and interpretation. Section 4 presents experimental results. Section 5 offers conclusions and future work.

2. Related works

In this section, some related works are introduced. Suppose there are n training samples x1, x2, ..., xn from C different classes {X1, X2, ..., XC}; ni is the sample number of the ith class, so n = n1 + n2 + ... + nC.

2.1 LMC

Local mean-based nonparametric classification (LMC) is an improved K-NN method [6]. It uses the Euclidean distance or the cosine distance to select nearest neighbors and measure the similarity between the test sample and its neighbors.
In general, the cosine distance is more suitable for describing the similarity of multi-dimensional data. LMC is described as follows. For each test sample y:

Step 1: Select the k nearest neighbors of y from the jth class, as a neighbor subset Nj = {xj1, xj2, ..., xjk};
Step 2: Calculate the local mean-based vector of each class by
mj = (1/k) * sum_{i=1..k} xji; (1)
Step 3: Calculate the distance d(y, mj) between y and each mj;
Step 4: Assign y to the class with the minimum d(y, mj) if the Euclidean distance metric is adopted, or with the maximum cosine similarity if the cosine distance metric is adopted.

2.2 SRC

SRC relies on a distance metric to penalize the dissimilar samples and reward the similar samples. Its main idea is to sparsely represent and classify the test sample by a linear combination of all the training samples. The test sample is assigned to the class that produces the minimum residue. SRC is described as follows.

Input: n training samples, a test sample y. Output: the class label of y.
Step 1: Construct the dictionary matrix A = [x1, x2, ..., xn] from the n training samples. Each column of A is a training sample, called a basis vector or atom. Normalize each column of A to unit l2-norm. A is required to have unit l2-norm (or bounded norm) columns in order to avoid the trivial solutions that are due to the ambiguity of the linear reconstruction.
Step 2: Construct and solve an l0-norm minimization problem,
min_x ||x||_0 s.t. y = Ax, (2)
where x is called the sparse representation coefficient vector of y. Eq.(2) is usually approximated by an l1-norm minimization problem,
min_x ||x||_1 s.t. ||y - Ax||_2 <= e, (3)
where e is the threshold of the residue. Eq.(3) can be generalized as a constrained least square problem,
min_x ||y - Ax||_2^2 + L*||x||_1, (4)
where L > 0 is a scalar regularization parameter which balances the tradeoff between the sparsity of the solution and the reconstruction error. Eq.(4) is a constrained LASSO problem; its detailed solution is found in Ref. [27].
Step 3: Compute the residues r_i(y) = ||y - A*d_i(x)||_2, where d_i(x) is the characteristic function that selects the coefficients associated with the ith class;
Step 4: The class label of y is identified as label(y) = argmin_i r_i(y).
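The SRC steps above can be sketched in numpy. This is our own minimal illustration: the LASSO form of Eq.(4) is solved here with plain iterative soft thresholding (ISTA) rather than the dedicated solver the paper cites in [27], and the function name is ours.

```python
import numpy as np

def src_classify(A, labels, y, lam=0.01, n_iter=500):
    """SRC sketch: solve min_x 0.5*||y - Ax||_2^2 + lam*||x||_1 by ISTA,
    then assign y to the class whose coefficients give the smallest residue.
    A: (m, n) dictionary, one training sample per column; labels: length-n."""
    A = A / np.linalg.norm(A, axis=0)          # Step 1: unit l2-norm atoms
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):                    # Step 2: ISTA iterations
        z = x - A.T @ (A @ x - y) / L          # gradient step on the residual term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    residues = {c: np.linalg.norm(y - A[:, labels == c] @ x[labels == c])
                for c in np.unique(labels)}    # Step 3: class-wise residues
    return min(residues, key=residues.get)     # Step 4: smallest residue wins
```

Because the class-wise residue keeps only the coefficients of one class at a time, a test sample lying near the subspace of its true class produces a small residue for that class and a large one elsewhere.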
2.3 WSRC

WSRC integrates both the sparsity and the locality structure of the data to further improve the classification performance of SRC. It imposes a larger weight on the training samples that are farther from the test sample. Different from SRC, WSRC solves a weighted l1-norm minimization problem,
min_x ||Wx||_1 s.t. ||y - Ax||_2 <= e, (5)
where W is a diagonal weighted matrix whose diagonal elements are the distances d(y, xi) between y and each training sample. Eq.(5) makes sure that the coding coefficients of WSRC tend to be not only sparse but also local in the linear representation [13], which can represent the test sample more robustly.

2.4 LSRC

Though many instances have been reported in which WSRC performs better than SRC in various classification problems, WSRC forms the dictionary from all the training samples, so the size of the generated dictionary may be large, which adversely affects solving the l1-norm minimization problem. To overcome this drawback, local sparse representation based classification (LSRC) performs the sparse decomposition in a local manner. In LSRC, the K-NN criterion is exploited to find the k nearest neighbors of the test sample, and the selected samples are utilized to construct the over-complete dictionary. Different from SRC, LSRC solves an l1 minimization problem in the local neighborhood,
min_x ||x||_1 s.t. ||y - A_N x||_2 <= e, (6)
where A_N stands for the data matrix consisting of the k nearest neighbors of y. Compared with the original SRC and WSRC, although the computational cost of LSRC is reduced remarkably when k is much smaller than n, LSRC does not assign different weights to the different training samples.

3. Two-stage LSCL

From the above analysis, it is found that each of LMC, WSRC and LSRC has its advantages and disadvantages. To overcome the difficult problem of plant recognition on a large-scale leaf image database, a two-stage LSCL leaf recognition method is proposed in this section. It is a sparse decomposition problem solved in a local manner to obtain an approximate solution.
Compared with WSRC and LSRC, LSCL solves a weighted l1-norm constrained least square problem in the candidate local neighborhoods of each test sample, instead of solving the same problem over all the training samples. Suppose there are a test sample y and n training samples from C classes, where ni is the sample number of the ith class and xij is the jth sample of the ith class. Each sample is assumed to be a one-dimensional column vector. The proposed method is described in detail as follows.

3.1 First stage of LSCL

Calculate the Euclidean distance d(y, xij) between y and each xij, and select the k nearest neighbors of y from the ith class with the first k smallest distances; the selected neighbor subset is denoted Ni = {xi1, ..., xik}, i = 1, 2, ..., C. Calculate the average of Ni,
mi = (1/k) * sum_{j=1..k} xij. (7)
Calculate the Euclidean distance d(y, mi) between y and mi. From the C neighbor subsets, select the S neighbor subsets with the first S smallest distances d(y, mi) as the candidate subsets for the test sample, denoted for simplicity N1, ..., NS. The training samples from these subsets are reserved as the candidate training samples for the test sample, and the other training samples are eliminated from the training set.

3.2 Second stage of LSCL

From the first stage, there are k×S training samples in the candidate subsets. For simplicity, the jth training sample of the ith candidate subset is written xij. The second stage first represents the test sample as a linear combination of all the training samples of the candidate subsets, and then exploits this linear combination to classify the test sample. From the first stage, we have already obtained the Euclidean distance d(y, xij) between y and each candidate sample. Based on these distances, a new local WSRC is proposed to solve the same weighted l1-norm minimization problem as Eq.(5),
min_x ||Wx||_1 s.t. ||y - Ax||_2 <= e, (8)
where A is the dictionary constructed from the k×S training samples of the candidate subsets, and W is the weighted diagonal matrix whose diagonal elements are the Euclidean distances d(y, xij). In Eq.(8), the weighted matrix is a locality adaptor that penalizes the distance between y and each xij.
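The weighted l1-norm coding step of Eq.(8) (and likewise Eq.(5)) can be sketched as follows. This is our own minimal illustration, assuming an unconstrained Lagrangian form of the problem and a proximal-gradient (ISTA) solver in place of the interior-point method the paper actually uses; the function name is ours.

```python
import numpy as np

def weighted_l1_coding(A, y, lam=0.01, n_iter=500):
    """Solve min_x 0.5*||y - Ax||_2^2 + lam*||Wx||_1 by proximal gradient,
    where W = diag(d(y, xij)) is the locality adaptor of Eq.(8):
    atoms far from y get a larger weight, so their coefficients shrink more."""
    A = A / np.linalg.norm(A, axis=0)              # unit l2-norm atoms
    w = np.linalg.norm(A - y[:, None], axis=0)     # diagonal of W: distance from y to each atom
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L              # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)  # weighted soft threshold
    return x
```

The only difference from plain l1 coding is the per-atom threshold lam*w/L: a nearby atom (small w) is barely penalized, a distant atom is strongly penalized, which is exactly the locality behavior the text ascribes to W.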
In the above SRC, WSRC, LSRC and LSCL, the l1-norm constrained least square minimization problem is solved by the approach proposed in [28], a specialized interior-point method for large-scale problems. The solution of Eq.(8) can be expressed as
x* = argmin_x ||Wx||_1 s.t. ||y - Ax||_2 <= e. (9)
Here x* is the sparse representation of the test sample. In representing the test sample, the contribution of the ith candidate neighbor subset is calculated by
yi = sum_{j=1..k} x*ij * xij, (10)
where x*ij is the jth sparse coefficient corresponding to the ith candidate nearest neighbor subset. Then the residue of the ith candidate neighbor subset with respect to the test sample y is
ri(y) = ||y - yi||_2. (11)
In Eq.(11), for the ith class (i = 1, 2, ..., S), a smaller ri(y) indicates a greater contribution to representing y. Thus, y is finally classified into the class that produces the smallest residue.

3.3 Summary of two-stage LSCL

From the above analysis, the main steps of the proposed method are summarized as follows. Suppose n training samples from C different classes, a test sample y, the number k of the nearest neighbors of y, and the number S of the candidate neighbor subsets.
Step 1. Compute the Euclidean distance between the test sample y and every training sample, respectively.
Step 2. Through the K-NN rule, find the k nearest neighbors of y in each training class as that class's neighbor subset, calculate the neighbor average of the neighbor subset of each class, and calculate the distance between y and each neighbor average.
Step 3. Determine the S neighbor subsets with the first S smallest distances, as the candidate neighbor subsets for y.
Step 4. Construct the dictionary from all training samples of the S candidate neighbor subsets and then construct the weighted l1-norm minimization problem of Eq.(8).
Step 5. Solve Eq.(8) and obtain the sparse coefficients.
Step 6. For each candidate neighbor subset, compute the residue between y and its estimation by Eq.(11).
Step 7.
Identify the class label that has the minimum ultimate residue and classify y into this class.

3.4 Rationale and interpretation of LSCL

In practice, some between-species leaves are very different from the other leaves, as shown in Fig.1A. They can be easily classified by the Euclidean distances between the leaf digital image matrices. However, some between-species leaves are very similar to each other, as shown in Fig.1B. They cannot be easily classified by simple classification methods. In Figs.1A and B, suppose the first leaf is the test sample, while the other seven leaves are training samples. It is difficult to identify the label of the test leaf by a simple classification method, because the test leaf is very similar to Nos. 4, 5, 6 and 7 in Fig.1B. However, it is certain that the test sample is not Nos. 1, 2 or 3, so we can naturally exclude these three leaves first. This exclusion example illustrates the purpose of the first stage of LSCL. From Fig.1C, it is found that there can be large differences between leaves of the same species. Therefore, in plant recognition, an optimal scheme is to select some training samples that are relatively similar to the test sample as the candidate training samples, such as Nos. 2 and 9 in Fig.1C, which are similar to the test sample, instead of considering all training samples. The average neighbor distance is used to coarsely recognize the test sample. The average neighbor distance, as a dissimilarity, is more effective and robust than the original distance between the test leaf and each training leaf, especially in the presence of noise and outliers. From the above analysis, in the first stage of LSCL, it is reasonable to assume that a leaf close to the test sample has a great effect on the classification decision; on the contrary, if a leaf is far enough from the test sample, it has little effect and may even have a side-effect on the classification decision.
These leaves should be discarded first, and then the later plant recognition task becomes clear and simple. In the same way, we can use the similarity between the test sample and the average of its nearest neighbors to select some neighbor subsets as the candidate training subsets of the test sample. In doing so, we eliminate the side-effect on the classification decision of any neighbor subset that is far from the test sample. Usually, for a classification problem, the more classes there are, the lower the classification accuracy, so the first stage is very useful. In the second stage of LSCL, there are only S nearest neighbor subsets as candidate class labels of the test sample, so the problem faced is indeed simpler than the original classification problem, because S < C and k×S is much smaller than n, i.e., few training samples are reserved to match the test sample. Thus the computational cost is greatly reduced and the recognition rate can be improved. We analyze the computational cost of LSCL in theory as follows. There are n samples from C classes, and every sample is an m×1 column vector. The first stage needs to calculate the Euclidean distance from y to each of the n training samples, select the k nearest neighbors of each class, and calculate the average of the k nearest neighbors, a cost that is linear in mn. In the second stage, only the k×S candidate training samples construct the dictionary A, so the weighted l1-norm minimization problem is solved over k×S atoms instead of all n training samples, whereas the classical SRC algorithm must solve it over the full training set [8,9]. Compared with SRC, the computational cost of LSCL is reduced remarkably when k×S is much smaller than n.

4. Experiments and result analysis

In this section, the proposed method is validated on a plant species leaf database and compared with state-of-the-art methods.

4.1 Leaf image data and experiment preparation

To validate the proposed method, we apply it to the leaf classification task using the ICL dataset.
All leaf images of the dataset were collected at the Botanical Garden of Hefei, Anhui Province, China, by the Intelligent Computing Laboratory (ICL), Chinese Academy of Sciences. The ICL dataset contains 6000 plant leaf images from 200 species, in which each class has 30 leaf images. Some examples are shown in Fig.2. In the database, some leaves can be distinguished easily, such as the first 6 leaves in Fig.2A, while some are difficult to distinguish, such as the last 6 leaves in Fig.2A. We verify the proposed method in two settings: (1) two-fold cross validation, i.e., 15 leaf images of each class are randomly selected for training and the remaining 15 are used for testing; (2) leave-one-out cross validation, i.e., one image of each class is randomly selected for testing and the remaining 29 leaf images per class are used for training.

Fig.2 Samples of different species from the ICL database: (A) original leaf images; (B) gray-scale images; (C) binary texture images.
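Putting the pieces together, the two-stage procedure summarized in Section 3.3 can be sketched end to end. This is a minimal illustration under our own naming, with an ISTA solver standing in for the interior-point method of [28] and samples stored as columns of a feature matrix.

```python
import numpy as np

def lscl_classify(X, labels, y, k=2, S=2, lam=0.01, n_iter=500):
    """Two-stage LSCL sketch. Stage 1 (Steps 1-3): per class, keep the k
    nearest neighbors of y, rank classes by the distance from y to each
    local mean, and keep the S closest classes. Stage 2 (Steps 4-7): local
    WSRC over the k*S candidate samples; classify by the smallest residue.
    X: (m, n) training samples as columns; labels: length-n label array."""
    cand = []
    for c in np.unique(labels):
        Xc = X[:, labels == c]
        d = np.linalg.norm(Xc - y[:, None], axis=0)
        nbr = Xc[:, np.argsort(d)[:k]]                 # k nearest neighbors in class c
        cand.append((c, nbr, np.linalg.norm(nbr.mean(axis=1) - y)))
    cand = sorted(cand, key=lambda t: t[2])[:S]        # S candidate neighbor subsets
    A = np.hstack([nbr for _, nbr, _ in cand])
    lab = np.repeat([c for c, _, _ in cand], k)        # class label of each atom
    A = A / np.linalg.norm(A, axis=0)                  # unit l2-norm atoms
    w = np.linalg.norm(A - y[:, None], axis=0)         # locality-adaptor weights of Eq.(8)
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):                            # ISTA for the weighted l1 problem
        z = x - A.T @ (A @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)
    res = {c: np.linalg.norm(y - A[:, lab == c] @ x[lab == c])
           for c, _, _ in cand}                        # Eq.(11): per-subset residues
    return min(res, key=res.get)                       # smallest residue wins
```

Note how the dictionary shrinks from n columns to k*S columns before the sparse coding step, which is the source of the cost saving discussed in Section 3.4.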

Friday, January 17, 2020

How African American Humor has Evolved and the Way We Look at Comedy

Professor Jim Gray of Sonoma State University defines culture as a means of survival. Going by this definition of culture, the evolution of black humor has definitely been a foundation of the survival of comedy in America. This paper will discuss how African American humor has evolved and for centuries has changed, and continues to change, the way we look at comedy. Before beginning this paper, I must stress the importance of humor for all races. Truly, the environment in which most humor takes place has helped American culture and people survive. According to Constance Rourke, humor is important because:

"1. Humor is a part of the natural life process and is commonly taken for granted or not recognized as having serious importance.
2. The fact that humor is a framework for 'non-real' or 'play' activity and not taken as a 'serious' interaction allows messages and formulations to be 'risked' within its framework which would not otherwise be acceptable or possible.
3. Humor allows the exploration of new ideas in situations of uncertainty or unfamiliarity. Similarly allowed are the negotiation of taboo topics, sensitive issues, and marginal serious content.
4. Humor performs a boundary function on both internal and external lines, policing groups in terms of membership and acceptable and competent behavior.
5. Humor can function as a coping device to release tension, allay fear, forestall threat, defuse aggression or distance the unpleasant.
6. Humor can represent an implicit contradiction, paradox or 'joke in the social structure' made explicit. The 'joke' constitutes a reversal within its boundaries of the patterns of control in the real world.
7. 'Canned' jokes and 'situational' jokes are not entirely separate. Canned jokes are not sealed from the situation in which they are told, as they always affect it and incorporate interaction into their pattern; situational jokes always have some impact beyond their context."
Langston Hughes says, "Humor is laughing at what you haven't got when you ought to have it. Of course, you laugh by proxy. You're really laughing at what the other guy lacks, not your own lack. That's what makes it funny: the fact that you don't know you are laughing at yourself. Humor is when the joke is on you but hits the other fellow first, because it boomerangs. Humor is what you wish in your secret heart were not funny, but it is, and you must laugh. Humor is your unconscious therapy" (Hughes, 1966). Laughter for centuries has been the medicine that has helped to ensure the survival of African Americans. "Herded together with others with whom they shared only a common condition of servitude and some degree of cultural overlap, enslaved Africans were compelled to create a new language, a new religion, and a precarious new lifestyle" (Joyner, 1984). As Africans were unloaded from boats and placed onto plantations, slave masters were completely enthralled by the way they spoke, moved, and danced. Out of slavery emerged a culture that would influence America's mainstream culture indefinitely. Slavery created bondage for Africans, and when it looked like they were going nowhere fast, they laughed, sang, and amused one another with riddles, jokes and animal tales from the homeland. Slave masters could not conceive why slaves in such a miserable state were so joyous; what they did not know was that many of the songs, jokes and riddles were more than surface deep and many times about the master. The slaves made the best of their circumstances through humor, laughing at the way the slave master treated them and at their own reactions to this treatment. They were laughing at the slave master and at the same time laughing at themselves. However, it did not take long before slave masters made slave merry-making public. Many times slaves were called upon to entertain the master and his guests. Slave merry-making was also encouraged because it increased the price of the slaves.
â€Å"People took notice to the way slaves spoke and moved, out of slavery evolved Blackface Humor. (Watkins, 1994) Blackface comedy was when a person (white) painted their face with black makeup and acted like a slave (Sambo). Blackface humor gave whites the chance to lift African American Humor from its original context, transform it, then spotlight it as their own entertainment, amusement (for non-black audiences) it became popular for it is supposed originality. As blackface entertainment became more popular so did the actors. George Washington Dixion introduced â€Å"Coal Black Rose† (Watkins) one song â€Å"Sambo and Cuffee†, (Watkins) was a comic song about a black woman and her lover. Dixion performed this act all over the world; some would argue that Dixion was the first white blackface performer to establish a broad reputation. By the 1830's, blackface performers were everywhere becoming one of the most popular attractions of the American stage. Billy Whitlock, Frank Brower, Frank Pelham and Dan Emmett were also very popular blackface performers. Dixion created the one man, show but these men created a troupe of blackface performers. They also firmly established the image of blacks as happy-go-lucky plantation darkies, outrageously dresses and ignorant. Although there were other blackface performers before them, these men were the only ones who could give a real show from the makeup to the costume. â€Å"By the 1840's blackface performances had reached an unprecedented level of national popularity. â€Å"(Watkins) There were many performance troupes, even professional juvenile troupes. Each followed a standard; they had a three-act presentation. The first act opened up with a walkaround where the entire troupe came out made up in face paint and dressed in suits. They than gathered in a semicircle to alternate comic songs and jokes. Here is a common type of joke many used; it is called; Mr. Bones: â€Å"Does us black folks go to hebbin? 
Does we go through dem golden gates?" Mr. Tambo: "Mr. Bones, you know the golden gates is for white folks." Mr. Bones: "Well, who's gonna be dere to open dem gates for you white folks?" For many of the white people watching the show, the funniest and most exciting part was the joke telling. The second act, the "olio" or variety segment, featured the stump speech speaker. This occurred when one member performed a comic, black version of a topic. Topics ranged from emancipation, women's suffrage and education to other current political or scientific topics. The goal was to show that blacks could neither comprehend nor interpret sophisticated ideas. The third and final part of the show was a slapstick plantation skit, featuring song and dance with costumed men and women dressed as slaves. After the Civil War, blackface troupes hired free black men and women to perform with them. White audiences became upset and angry at many troupes. After the war and emancipation, during the Reconstruction period, constitutional amendments were passed to assure civil rights and voting rights for former slaves, and some blacks were elected members of the House and Senate; whites wanted to be assured that blacks were still inferior, and blackface troupes that continued to hire blacks were not showing this. Therefore, audiences dwindled, and many troupes that had incorporated blacks started to perform on circuits like the "Chitlin' Circuit," which hit most black-owned theaters. Blacks who were part of the troupes started to branch off and start their own troupes. In doing this, they altered the usual blackface performance routine. First, they altered song lyrics: instead of singing songs that downgraded blacks, songsters would play on white fears and mock them. Many blacks took off the face paint and introduced musical comedies. Black musical comedies made many black performers successful. Whites already loved black music, so the musical comedy fit right into the market.
Still, many of these comedies were on the circuit and confined to black theaters. It was not until later that musical comedies were featured on Broadway. When musical comedies appeared on Broadway, "Lyles and Miller, a very successful team, created a whole new approach to the comedies" (Watkins). They presented at the end of their acts a group of women who danced and sang with the stereotypical attitude many felt black urban women had. This simple addition astounded Broadway, and critics raved. Eventually, every black troupe evolved to use this form. Black musical comedies took blacks to another level of comedy, yet they were unable to shake the Sambo stereotype given to them by white blackface performers. Licensed radio was introduced in 1920; because of low budgets and inadequate facilities, news shows and music provided by local groups dominated the airwaves. By 1922, there were over 522 licensed stations, and radio sales increased from $1 million in 1920 to $400 million in 1925. By 1929, one in every three homes owned a radio; ten years later there was a radio in almost every home. "Radio was a medium where its listeners could hear concerts, comic monologues, sporting events and political speeches as they happened" (MacDonald, 1981). Radio initially ignored blacks; as in the blackface performance days, they were imitated by whites. In 1925, Freeman F. Gosden and Charles J. Correll, a minor duo, debuted as musicians on a radio station in Chicago. They played at this radio station for a while and later moved to a station owned by the Chicago Tribune. There they were approached by management about doing a broadcast adaptation of the comic strip "The Gumps." The two refused the offer but suggested an alternative: a black-dialect show. Gosden and Correll made a series based on two black characters, "Sam 'N' Henry," which would later become known as "Amos 'N' Andy."
"Sam 'N' Henry" debuted on January 12, 1926 (Dunning, 1925-1976). The characters Sam and Henry still depended on the stereotypical images of blacks created during the blackface (minstrel) performance years: blacks were superstitious, naive, easily influenced, lazy, ignorant, and conniving. On March 19, 1928, three months after the "Sam 'N' Henry" show had been cancelled, "Amos 'N' Andy" mysteriously appeared on a rival station in Chicago. Gosden and Correll had come up with the idea, presented it to the station, and it was accepted. This show was far more successful than "Sam 'N' Henry"; "Amos 'N' Andy" was recorded and leased to forty other radio stations. In August 1929, Pepsodent became the first major sponsor of a black comedy show. "Amos 'N' Andy" was the number one show in the country. By 1935, 70 percent of American homes (40 million listeners) tuned in each night. Sayings from the show hit the streets: "Ain't dat sumptin'," "Splain dat to me," and "Holy Mackerel" became popular. Even with its popularity, the show had a down time. Radio stations modernized their broadcast methods; comedians were no longer forced to work without an audience. This is when variety shows began to take the market. In 1943, Gosden and Correll returned to the air with a thoroughly revamped half-hour version of "Amos 'N' Andy." The show was performed before a live audience and featured an orchestra and chorus. "Amos 'N' Andy" represented a breakthrough for black comedians on radio and television as well. Although one-person acts were not popular during the variety show period, Moms Mabley set the stage for many comedians who would come after her. Jackie "Moms" Mabley was born in North Carolina in 1897 and grew up in Cleveland, Ohio; by the time she was sixteen she had become a stage performer. She began as a dancer and singer and dabbled in comedy. 
During the 1920s, she was performing on the Chitlin' Circuit in Dallas, where another team saw her act and helped her get better bookings. Like many performers, she appeared in skits with other performers at first. However, Mabley did not like this, and she was one of the first comics to turn to monologue humor. She appeared on the stage with oversized clodhoppers, tattered gingham dresses, and oddball hats; she acted like a typical down-to-earth older black woman. Mabley worked with many performers, but she did her best when she was alone. She was famous for her costume and her shuffle; she would sing some comical version of a popular song, tell stories, or just stand there, and the audience loved it. Mabley foreshadowed the shift to direct social commentary and stand-up comic techniques that would dominate humor and the comedians to come. Dick Gregory, Flip Wilson, Redd Foxx, Steve Allen, Richard Pryor, Whoopi Goldberg, Eddie Murphy, and many other popular black and white comedians have evolved from this history of comedy. The images that were passed on from slavery still thrive at the root of the jokes many comedians tell today. Black comedians have finally gotten away from the white interpretation of black humor and presented original black humor from an African American perspective to the world. Black comedy has come to be the voice of the struggle, pain, and joy African American people have gone through and are continuing to go through. Humor will continue to be a driving force bringing people of all ethnicities together to laugh at the good and bad times of our country. Without humor, would we really survive?

Thursday, January 9, 2020

Impacts From Each Circle of Hell

Impacts from Each Circle of Hell Dante Alighieri's The Inferno, translated by John Ciardi, is an epic poem based on Dante and Virgil's journey through hell. Lucifer was an angel in heaven and God's right-hand man. He wanted to be equal to God and to have as much power and all the respect that God had obtained from all of the other angels. After God found out how he was trying to gain more power, he sent him below the Earth's surface. This is where hell resides. Dante was inspired to write this book after being exiled from Florence. The Stanford Encyclopedia stated about Dante's life, "he never returned to Florence, and played no further role in public life, though he remained passionately interested in Italian politics, and became virtually the prophet of world empire in the years leading up to the coronation of Henry VII of Luxemburg as head of the Holy Roman Empire" (1312). One inspiration was the political nightmare Florence was facing. Before the fourteenth century, the church and state were not separated. Throughout this epic poem, Dante develops his major theme that the state and church should be independent of each other but have equal powers. Another way Dante shows his hatred for the government is by identifying significant political figures of that time period in his journey through hell. Cantos five and thirteen show how the sinners impacted Dante, the questions that arise from each encounter, and insight into the main themes of his epic poem.

Wednesday, January 1, 2020

What Is Gorilla Glass? Composition and Facts

Gorilla Glass is the thin, tough glass that protects cell phones, laptop computers, and millions of other portable electronic devices. Here's a look at what Gorilla Glass is and what makes it so strong.

Gorilla Glass Facts

Gorilla Glass is a specific brand of glass manufactured by Corning. Presently, the world uses the fifth generation of the material, which has been improved over the years. Compared to other types of glass, Gorilla Glass is particularly hard, thin, lightweight, and scratch resistant. Gorilla Glass hardness is comparable to that of sapphire, which is 9 on the Mohs scale of hardness. Regular glass is much softer, closer to a 7 on the Mohs scale. The increased hardness means you're less likely to scratch your phone or monitor from daily use or contact with other items in your pocket or purse.

How Gorilla Glass Is Made

The glass consists of a thin sheet of alkali-aluminosilicate. Gorilla Glass is strengthened using an ion-exchange process, which forces large ions into the spaces between molecules on the glass surface. Specifically, the glass is placed in a 400 °C molten potassium salt bath, which forces potassium ions to replace the sodium ions originally in the glass. The larger potassium ions take up more space between the other atoms in the glass. As the glass cools, the crowded-together atoms produce a high level of compressive stress in the glass that helps protect the surface from mechanical damage.

Gorilla Glass Invention

Gorilla Glass isn't a new invention. The glass, originally named Chemcor, was actually developed by Corning in 1960. At that time its only practical application was in racing cars, where strong, lightweight glass was needed. In 2006, Steve Jobs contacted Wendell Weeks, the CEO of Corning, seeking a strong, scratch-resistant glass for Apple's iPhone. With the success of the iPhone, Corning's glass has been adopted for use in numerous similar devices. 
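The size mismatch behind the ion exchange can be sketched numerically. This is an illustrative back-of-the-envelope calculation, not part of the original article or Corning's actual process model; it uses the standard Shannon six-coordinate ionic radii (Na+ ≈ 102 pm, K+ ≈ 138 pm) to show how much extra room a potassium ion demands once it replaces a sodium ion at the glass surface.

```python
# Toy illustration: why swapping Na+ for K+ crowds the glass surface.
# Shannon six-coordinate ionic radii, in picometers (reference values).
NA_RADIUS_PM = 102  # sodium ion originally in the glass
K_RADIUS_PM = 138   # potassium ion forced in from the molten salt bath


def percent_increase(old: float, new: float) -> float:
    """Percent increase going from `old` to `new`."""
    return (new - old) / old * 100


radius_gain = percent_increase(NA_RADIUS_PM, K_RADIUS_PM)
# Volume scales with the cube of the radius, so the crowding is much larger
# than the radius comparison alone suggests.
volume_gain = percent_increase(NA_RADIUS_PM ** 3, K_RADIUS_PM ** 3)

print(f"K+ radius is about {radius_gain:.0f}% larger than Na+")
print(f"K+ occupies about {volume_gain:.0f}% more volume than Na+")
```

A potassium ion is roughly a third wider than the sodium ion it replaces, but occupies well over twice the volume, which is why the cooled surface ends up under strong compressive stress.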
In 2017, over five billion devices incorporated Gorilla Glass, but other products with similar properties compete in the global market. These include sapphire glass (corundum) and Dragontrail (an alkali-aluminosilicate sheet glass made by Asahi Glass Co.).

Did You Know?

There is more than one type of Gorilla Glass. Gorilla Glass 2 is a newer form of Gorilla Glass that is up to 20% thinner than the original material, yet still as tough. Gorilla Glass 3 resists deep scratches and is more flexible than its predecessors. Gorilla Glass 4 is thinner and more damage resistant. Gorilla Glass 5 was introduced in 2016 for use in the Samsung Galaxy Note 7. Gorilla Glass SR was also introduced in 2016, for use in the Samsung Gear S3 smartwatch.